Paper: Social Structure and Language Structure: the New Nomothetic Approach


Found via Gene Expression: blogs.discovermagazine.com/gnxp/2013/01/a-correlation-between-acacia-trees-and-tonal-languages/

Recent studies have taken advantage of newly available, large-scale, cross-linguistic data and new statistical techniques to look at the relationship between language structure and social structure. These ‘nomothetic’ approaches contrast with more traditional approaches and a tension is observed between proponents of each method. We review some nomothetic studies and point out some challenges that must be overcome. However, we argue that nomothetic approaches can contribute to our understanding of the links between social structure and language structure if they address these challenges and are taken as part of a body of mutually supporting evidence. Nomothetic studies are a powerful tool for generating hypotheses that can go on to be corroborated and tested with experimental and theoretical approaches. These studies are highlighting the effect of interaction on language.
Key words: nomothetic, social structure, complex adaptive systems, linguistic niche hypothesis, cultural evolution

Polymaths, freedom of information, and copyright – why we need copyright reform to more effectively increase the number of polymaths

I forgot to mention that I have written a post about polymathy and copyright reform over at Project Polymath. Reposted below. Direct link to post.

---

Introduction
Polymaths are people with a deep knowledge of multiple academic fields, and often various other interests as well, especially artistic ones, but sometimes even things like tropical exploration. Here I will focus on acquiring deep knowledge of academic fields, and on why copyright reform is necessary to increase the number of polymaths in the world.

Learning method
What is the fastest way to learn about some field of study? There are a few methods of learning: 1) listening to speeches, lectures, podcasts and the like, 2) reading, and 3) figuring things out oneself. The last method will not work well for any established academic field. It takes too long to work out all the things other people have already worked out, if indeed it can be done at all. Many experiments are not possible to do oneself. But it can work well for a very recent field, a field that is not currently in development, or a field where it is very easy to work things out oneself (gather and analyze data). Data mining the internet is a very easy way to find out many things without having to spend money, although it is usually faster to find someone else who has already done the work. In any case, programming ability is a very valuable skill for a polymath to have.

For most fields, however, this leaves either listening in some form, or reading. I have recently discussed these at greater length, so I will just summarize my findings here. Reading is by far the best choice. Not only can one read faster than one can listen, written language is also of greater complexity, which allows for more information acquired per word, and hence per unit of time. Listening to live lectures is probably the most common way of learning by listening; it is the standard at universities. Usually these lectures last too long for one to concentrate throughout, and if one misses something, it is not possible to go back and have it repeated. Nor is it possible to skip ahead if one has already learned whatever it is the speaker is talking about. Listening to recorded (i.e. non-live) speech is better in both of these ways, but it is still much slower than reading. Khan Academy is probably the best way to learn subjects like math and physics by listening to recorded, short-length lectures. It also has built-in tests with instant feedback, and a helpful community. See also the book Salman Khan recently wrote about it.

If one seriously wants to be a polymath, one will need to learn at speeds much, much faster than the speeds at which people usually learn, even very clever people (≥2 sd above the mean). This means lots and lots of self-study and self-directed learning, mostly in the form of reading, but not limited to it. There are probably some things that are faster and easier to learn by having them explained in speech. Having a knowledgeable tutor surely helps one make a good choice of what to read. When I started studying philosophy, I spent hundreds of hours on internet discussion forums, and through them I acquired quite a few friends who were knowledgeable about philosophy. They helped me choose good books and texts to read to increase the speed of my learning.

Finally, there is one more way of listening that I didn't mention: one-to-one tutor-based learning. It is very fast compared to regular classroom learning, typically resulting in a two-standard-deviation improvement. But this method is unavailable to almost everybody, and so is not worth discussing at length. Individual tutoring can be written or verbal or some mix, so it doesn't fall precisely under any one of the categories mentioned before.

How to start learning about a new field
So, suppose one wants to learn something about a given field of study. Where to begin? Obviously, the best place to begin almost any study is the internet, especially Wikipedia. When one has read the article about the field on Wikipedia, one can then proceed to read the various articles referred to in that article, or jump right into some of the sources listed. However, it is better to get hold of a good textbook and learn from that. After all, textbooks are exactly the kind of book that is written to introduce one to a field of study. It would be very odd indeed if some other kind of book were better at introducing people to a field; that would mean that textbook authors had utterly and completely failed in their mission. I hammer this point home because for some people, perhaps including some polymath aspirants, it is not obvious. Especially in philosophy, people have some strange idea that the best way to begin is to read huge, incomprehensible works (say, Being and Time), or to 'start from the beginning' with the pre-Socratics. See my post here. But the point applies equally well to other fields: the best way to start learning physics is not to read Newton's Principia.

Now, since polymaths need to learn a lot, and the preferred method of learning is reading, it follows that they need to read a lot. However, this can be an economic problem: information is still costly to acquire. Polymaths are often dedicated to learning and spend their entire day on it (I spend >10 hours most days), which means that holding a job is not a viable solution; there isn't enough time available. Thanks to the internet, there is now a wealth of information freely available. However, not all information is freely available, and this presents a problem for would-be polymaths and for established polymaths who want to expand into another field of study. One could buy the material oneself, but this quickly gets expensive. One could borrow the material from a library, but this requires reading paper books, which is not optimal, and one cannot keep them around for future reference.

Primarily, there are two kinds of written sources that are not yet completely freely available: 1) journal articles, and 2) books. Another, less important source is newspaper articles.

Journals
Many polymaths and polymaths-in-training are university students or teachers and thus usually have access to academic journals through their university. However, the university often does not have access to all journals, and so if one stumbles upon an interesting paper that happens to be published in some obscure or perhaps defunct journal, it can be hard to obtain. One can always try to ask the authors for the paper by email, and this often works, but not always: the authors may not want to help, they may be dead, or the listed email address may no longer work. This is clearly unsatisfactory for the polymath, whose curiosity is often insatiable. I know it annoys me very much whenever this happens.

Fortunately, journals are moving in the direction of open access, and the scientific community is increasingly unhappy with the way journals operate or used to operate. Usually researchers want their papers to be read, not hidden away behind a paywall. Even mainstream newspapers are writing about the issue. Countries and universities (e.g. Danish ones) are requiring their researchers to publish in open-access journals, or to upload their papers to sites like arXiv or SSRN, where they can be freely downloaded. Internet activist Aaron Swartz also tried to liberate millions of papers recently, but was unfortunately caught in the act; the absurd legal consequences probably contributed to his decision to commit suicide. Still, the situation with respect to free access to the information in journals is improving quickly.

If we legalized non-commercial copying of copyrighted works, the situation would change almost instantly. Very quickly, companies like Google would provide access to all academic papers ever published, at no cost to the user. This enormous improvement would of course not only help polymaths; it would help anyone wanting to learn more. Most people are not university students or teachers, and so do not have access to academic journals. People who are unaffiliated with a university, polymaths or not, stand to gain the most from such a change. A huge benefit to society at large.

Books
A lot of good information still exists only in paper book form, and books are prohibitively expensive for a non-wealthy polymath. I don't consider myself extreme among polymaths, but I read something like >30 nonfiction books a year (reading list). Buying all of these is out of the question: much too expensive. Rare academic books can cost hundreds of dollars in paper copy. An absurd situation, and extremely unsatisfying for a polymath. It is possible to fight back, however: one can buy books and set them free. With ebooks, crack the protection and spread them; with paper books, scan them (or have them scanned) and then release them.

Of course, a lot of books can be found in ebook versions for free, either legally or not. However, the situation has recently deteriorated because the copyright industry (in this case, the book publishers) has successfully shut down several of the best illegal ebook downloading sites (library.nu in particular was very good). Due to the way torrents work, they are ill-suited to sharing thousands of different books, although several sites have tried (and been shut down again, perhaps due to legal pressure). Still, one can find millions of ebooks via torrents, either in huge compilations of books about a given subject (e.g. this one is of interest to polymaths, or this, or this), or as single-book torrents. Single-book torrents usually exist only for famous books: useful at times, but not satisfactory at all.

To be sure, books that are out of copyright can often be found and downloaded legally at great sites such as Project Gutenberg. Surely, if the copyright duration were reduced, Gutenberg and similar projects would immediately start making millions more old books freely available. Getting books from Gutenberg and sites like it is mostly useful for historical studies and for fields where the dating of the books matters less. In philosophy, for example, there is still much to learn from reading Hume or John Stuart Mill. But there isn't much to learn in empirical science from reading papers from the 17th century, except out of historical interest.

Google has already scanned millions of books. They are made somewhat available for free via the Google Books service, but copyright law (and settlements with the publishing industry) demands that parts of the books be left out. However, if copyright were changed tomorrow, Google would quickly unblock these parts, making the information in them completely freely available. Google has already collaborated with various large libraries in scanning their books. When it comes to freedom of information, internet pirates and libraries are on the same team. The internet is the world's greatest library of culture and information, and it will get much better when copyright law changes.

Summary
When copyright law changes, both books and academic papers will be free, and we will enter the true information age. It is only a matter of time. This will benefit almost everybody, including polymaths; the losers will be the now-obsolete middlemen. It will be much easier, especially for poor people and people not affiliated with a university, to become polymaths, and of course for others to learn as well. At that point, only time, interest, and ability will set the limit, not money.

Paper: The impact of genetic enhancement on equality (Michael H. Shapiro)

The impact of genetic enhancement on equality, found via another paper: The rhetoric and reality of gap closing—when the “have-nots” gain but the “haves” gain even more (Stephen J. Ceci and Paul B. Papierno), which I was reading because I was reading various papers on Linda Gottfredson's homepage.

Abstract:

There apparently is a genuine possibility that genetic and non-genetic mechanisms eventually will be able to significantly enhance human capabilities and traits generally. Examining this prospect from the standpoint of equality considerations is one useful way to inquire into the effects of such enhancement technologies. Because of the nature and limitations of competing ideas of equality, we are inevitably led to investigate a very broad range of issues. This Article considers matters of distribution and withholding of scarce enhancement resources and links different versions of equality to different modes of distribution. It briefly addresses the difficulties of defining “enhancement” and “trait” and links the idea of a “merit attribute” to that of a “resource attractor.” The role of disorder-based justifications is related to equality considerations, as is the possibility of the reduction or “objectification” of persons arising from the use of enhancement resources. Risks of intensified and more entrenched forms of social stratification are outlined. The Article also considers whether the notion of merit can survive, and whether the stability of democratic institutions based on a one-person, one-vote standard is threatened by attitude shifts given the new technological prospects. It refers to John Stuart Mill's “plural voting” proposal to illustrate one challenge to equal-vote democracy.

Nevertheless, it is conceivable that, despite rigorous division of labor, there may be political and social equality of a sort. Different professions, trades, and occupations and the varying aptitudes underlying them might be viewed as equally worthy. The “alphas” may be held equal to the “betas,” though their augmentations (via the germ line or the living body) and life-work differ. Perhaps (paradoxically?) there will be an “equality of the enhanced” across their categories of enhancement. But do not count on it.

 

Sort of. At least one study showed that nootropics have a greater effect the lower the intelligence of the population. So, in theory, it is possible that at some theoretical maximum M relative to drug D, the drug would have no effect, and everybody under that M would be boosted toward M, given adequate doses of D.
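The ceiling idea can be sketched as a toy model. To be clear, the linear boost and all the numbers below are illustrative assumptions of mine; the studies only report an IQ-drug interaction, not this particular functional form:

```python
# Toy model of the IQ-drug interaction described above: drug D adds a
# fixed boost, but never raises a person past its theoretical maximum M.
# The linear-boost shape and all numbers are made up for illustration.

def boosted_score(baseline: float, boost: float, ceiling: float) -> float:
    """Score after taking the drug: baseline plus boost, capped at M."""
    return min(baseline + boost, ceiling)

def drug_effect(baseline: float, boost: float, ceiling: float) -> float:
    """Net gain from the drug; zero for anyone already at or above M."""
    return max(0.0, boosted_score(baseline, boost, ceiling) - baseline)

# Lower baselines gain more; at or above the ceiling the drug does nothing.
low = drug_effect(85, 20, 100)    # gains 15 points
mid = drug_effect(95, 20, 100)    # gains only 5 points
high = drug_effect(110, 20, 100)  # gains nothing
```

In this model everyone below M with a large enough boost ends up exactly at M, which is the "everybody under M is boosted to M" scenario.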

 

I did come across another study with this IQ-drug interaction effect once, but apparently I didn't save it on my computer, and I can't seem to find it again. It seems difficult to find papers about exactly this.

 

Below is a figure from the study I mentioned above; it is about Ritalin:

 

Effects of methylphenidate (Ritalin) on paired-associate learning and Porteus maze performance in emotionally disturbed children.

 

 

Something similar seems to be the case with modafinil, another nootropic. It would be interesting to see if there is any drug-drug interaction between Ritalin and modafinil, specifically, whether they stack or not.

 

en.wikipedia.org/wiki/Modafinil#Cognitive_enhancement

 

Here is the best study mentioned on Wikipedia: Cognitive effects of modafinil in student volunteers may depend on IQ.

 

As for the topic of cognitive enhancers in general, see this somewhat recent 2010 systematic review. It appears that Ritalin isn't a good cognitive enhancer, but modafinil is promising for non-sleep-deprived persons: Modafinil and methylphenidate for neuroenhancement in healthy individuals: a systematic review.

 

 

a. Enhancement and democratic theory: Millian plural voting and the attenuation of democracy.

i. Kinds of democracy; is one-person, one-vote a defining characteristic of democracy? Most persons now acknowledge that there are stunning differences, both inborn and acquired, among individuals. Not everyone can be a physicist, novelist, grandmaster, astronaut, juggler, athlete, or model, at least without enhancement, and those who can will vary sharply among themselves in abilities.

For better or worse, these differences make for serious social, economic, and political inequalities. The question here is what effect these differences in human characteristics ought to have on various matters of political governance. If we are not in fact equal to each other in deliberative ability, judgment, and drive, why do we all have equal voting power in the sense that, when casting ballots in general elections, no one's vote counts for more than another's? We are not equal in our knowledge of the issues, our abilities to assess competing arguments, the nature and intensities of our preferences, our capacities to contribute to our social and economic system, our stakes in the outcomes of particular government policies, or even in our very interest in public affairs.

 

 

This topic was the primary reason I started reading this paper.

 

I also found some other papers dealing with what one might call Millian meritocracy. I came upon the idea independently, but was preceded by J. S. Mill by about 150 years.

His writing on the subject is here: John Stuart Mill – Considerations on Representative Government.

 

Another paper I found is this: Why Not Epistocracy?

 

 

 

Why Open Access is a moral imperative

I have written about this before, but here is some more material I came across.
Found via:
www.techdirt.com/articles/20130117/03040821712/scientist-explains-why-putting-research-behind-paywall-is-immoral.shtml

For general public:
www.guardian.co.uk/science/blog/2013/jan/17/open-access-publishing-science-paywall-immoral
www.timeshighereducation.co.uk/story.asp?sectioncode=26&storycode=417576&c=1
www.guardian.co.uk/science/2012/jul/15/free-access-british-scientific-research

Background material for the curious:
Blog series:
svpow.com/2009/06/01/choosing-a-journal-for-the-neck-posture-paper-why-open-access-is-important/
svpow.com/2011/09/29/researchers-stop-doing-free-work-for-non-open-journals/
svpow.com/2011/10/17/collateral-damage-of-the-non-open-reviewing-boycott/
svpow.com/2012/01/30/what-is-a-private-sector-research-work/
svpow.com/2012/03/30/my-rcuk-submission/
svpow.com/2012/05/17/see-this-is-why-publishers-irritate-me-so-much/
svpow.com/2012/05/18/publishers-versus-everyone/
svpow.com/2012/10/16/publish-means-make-public-paywalls-are-the-opposite-of-publishing/

Other:
drvector.blogspot.dk/2007/02/reason-317-to-be-depressed-journal.html
www.scottaaronson.com/writings/journal.html
www.michaeleisen.org/blog/?p=911

Art Jensen on Galton + a review of Sir Francis Galton: The Legacy of His Ideas (edited by Milo Keynes)

GALTON AND THE COMING OF EMPIRICAL PSYCHOLOGY
All the early influences on differential psychology mentioned so far came from philosophers. None was an empirical scientist. Darwin was, of course, but Darwinian ideas were introduced into psychology by Herbert Spencer, a professional philosopher. The empirical study of mental ability and individual differences could not begin until someone took up the methods of empirical science, that is, asking definite questions of nature and discovering the answers through analysis of data based on systematic observation, objective measurement, and experimentation. The first person to do this was the Victorian eccentric, polymath, and genius Sir Francis Galton (1822-1911). Galton was Charles Darwin's younger half-cousin—half-cousin because they had only one grandparent in common, Erasmus Darwin, a noted physician, physiologist, naturalist, and poet. Born into a prominent and wealthy family, Galton was a child prodigy, who could read and write before the age of four. He intensely disliked school, however, and his parents transferred him from one private boarding school to another, each as boring and frustrating to him as the others, and he begged his parents to let him quit. In his Memories of My Life (1908), written when he was 86, he still complained of his unsatisfying school experience. At age fifteen, he was sent away to college, which offered more challenge. To satisfy his parents' ambition that he follow in his eminent grandfather's footsteps and become a physician, he entered medical school. There he soon discovered that the basic sciences—physics, chemistry, biology, and physiology—were far more to his liking than medical practice. So he left medical school for Cambridge University, there to major in mathematics in preparation for a career in science.

Soon after Galton graduated, at age twenty-one, his father died, and Galton received a large inheritance that made him independently wealthy for the rest of his very long life. It allowed him to pursue his extremely varied interests freely in all things scientific. His enthusiastic and catholic curiosity about natural phenomena drove him to become perhaps the greatest scientific dilettante of all time. Because he was also a genius, he made original contributions to many fields, some of them important enough to be accorded chapters in books on the history of several fields: criminology, eugenics, genetics, meteorology, psychology, and statistics. He first gained fame in geography, as an explorer, expertly describing, surveying, and mapping previously unexplored parts of Africa. For this activity, his name is engraved on the granite facade of the Royal Geographical Society's building in London, along with the names of the most famous explorers in British history. (His fascinating book The Art of Travel [1855] was a long-time best seller and went through nine editions.) He also made contributions to meteorology, inventing isobar mapping, being the first to write a daily newspaper weather report, and formulating a widely accepted theory of the anticyclone. He made other original contributions to photography, fingerprint classification, genetics, statistics, anthropology, and psychometrics. His prolific achievements and publications brought worldwide recognition and many honors, including knighthood, Fellow of the Royal Society, and several gold medals awarded by scientific societies in England and Europe. As a famous man in his own lifetime, Galton also had what Hollywood calls “star quality.”

Biographies of Galton also reveal his charming eccentricities. His profuse intellectual energy spilled over into lesser achievements or activities that often seem trivial. He was almost obsessed with counting and measuring things (his motto: “When you can, count!”), and he devised mechanical counters and other devices to help in counting and tabulating. He loved data. On his first visit to a city, for example, he would walk around with a small, hand-held mechanical counter and tally the number of people passing by, tabulating their characteristics—tall, medium, short; blond, brunette, redhead—separately for males and females, the latter also rated for attractiveness. To be able to manage all these data while walking about, he had his tailor make a special vest with many little pockets, each one for a particular tabulated characteristic. He could temporarily store the data from his counters by putting into designated pockets the appropriate number of dried peas. Back in his hotel room, he counted the peas in each pocket and entered the numerical results in his notebook for later statistical calculations.

He devised an objective measure of the degree to which a lecturer bored the audience, and tried it out at meetings of the Royal Society. It consisted of counting the involuntary noises—coughs, feet shuffling, and the like—that issued from the audience, and, with a specially rigged protractor, he measured the angle that listeners' heads were tilted from a vertical position during the lecture. A score derived from the data obtained with this procedure showed that even the most eloquently written lecture, if read verbatim, was more boring than an extempore lecture, however rambling and inelegant.

He also invented a special whistle (now called a Galton whistle), which is familiar to many dog owners. Its high-frequency pitch is beyond humans' audible range and can be heard only by dogs and certain other animals. Galton made a series of these whistles, ranging widely in pitch, and used them to find the upper limits of pitch that could be heard by humans of different ages. To compare the results on humans with the auditory capacities of many species in the London Zoo, he would attach the whistles to the end of a tube that could be extended like a telescope, so it could reach into a cage and direct the sound right at the animal's ear. While quickly squeezing a rubber bulb attached to one end of the long tube to force a standard puff of air through the whistle attached to the other end, he would note whether or not the animal reacted to a particular pitch.

In another amusing project, he used the mathematics of solid geometry to
figure out the optimal way to cut a cake of any particular shape and dimensions
into any given number of pieces to preserve the freshness of each piece. He
published his clever solution in a mathematics journal. There are many other
quaint anecdotes about Galton’s amazing scientific curiosity and originality, but
the several already mentioned should suffice to round out the picture of his
extraordinary personality.

Although he died (at age ninety) as long ago as 1911, his legacy remains remarkably vivid. It comprises not only his many pioneering ideas and statistical inventions, still in use, but also the important endowments, permitted by his personal wealth, for advancing the kinds of research he thought would be of greatest benefit to human welfare. He founded the Department of Eugenics (now Genetics) at the University of London and endowed its Chair, which has been occupied by such luminaries as Karl Pearson, Sir Ronald Fisher, and Lionel Penrose; he furnished a psychological laboratory in University College, London; he founded two prestigious journals that are still active, Biometrika and The Annals of Human Genetics; and he founded (in 1904) the Eugenics Society (recently renamed The Galton Institute), which maintains an extensive library, publishes journals and books, and sponsors many symposia, all related to the field now known as social biology.

THE TWO DISCIPLINES OF SCIENTIFIC PSYCHOLOGY

Galton's position in the history of behavioral science is stellar. He is acknowledged as one of the two founding fathers of empirical psychology, along with Wilhelm Wundt (1832-1920), who established the first laboratory of experimental psychology in 1879 in Leipzig. As Wundt is recognized as the father of experimental psychology, Galton can certainly be called the father of differential psychology, including psychometrics and behavioral genetics. Each is now a major branch of modern behavioral science. The leading historian of experimental psychology, Edwin G. Boring (1950), drew the following interesting contrast between the scientific personalities of Galton and Wundt:

Wundt was erudite where Galton was original; Wundt overcame massive obstacles by the weight of his attack; Galton dispatched a difficulty by a thrust of insight. Wundt was forever armored by his system; Galton had no system. Wundt was methodical; Galton was versatile. Wundt's science was interpenetrated by his philosophy; Galton's science was discursive and unstructured. Wundt was interminably arguing; Galton was forever observing. Wundt had a school, a formal self-conscious school; Galton had friends, influence and effects only. Thus, Wundt was personally intolerant and controversial, whereas Galton was tolerant and ready to be convicted of error. (pp. 461-62)

Wundt and Galton were the progenitors of the two main branches of scientific psychology—experimental (Wundt) and differential (Galton). These two disciplines have advanced along separate tracks throughout the history of psychology. Their methodological and even philosophical differences run deep, although both branches embrace the scientific tradition of objective testing of hypotheses.

Experimental psychology searches for general laws of behavior. Therefore, it treats individual differences as unwanted variance, termed “error variance,” which must be minimized or averaged out to permit the discovery of universal regularities in the relation between stimulus and response. The method of experimental psychology consists of controlling variables (or treatment conditions) and randomizing the assignment of subjects to the different treatments. The experimental conditions are intentionally manipulated to discover their average effects, unconfounded by individual differences. In general, the stimulus presented to the subject is varied by the experimenter, while the subject's responses are recorded or measured. But the data of primary interest to the experimental psychologist consist of the averaged performance of the many subjects randomly assigned to each condition.

Differential psychology, on the other hand, seeks to classify, measure, and then explain the variety and nature of both individual and group differences in behavioral traits as phenomena worthy of investigation in their own right. It uses statistical analysis, such as correlation, multiple regression, and factor analysis, applied to data obtained under natural conditions, rather than the controlled conditions of the laboratory. Obviously, when human characteristics are of interest, individual differences and many other aspects of behavior cannot feasibly or ethically be controlled or manipulated by the investigator. Therefore, scientists must study human variation as it occurs under natural conditions. During the latter half of this century, however, a rapprochement has begun between the two disciplines. Both experimental and correlational methods are being used in the study of cognition.

Galton's Methodological Contributions. Galton made enduring contributions to the methodology of differential psychology. He was the first to devise a precise quantitative index of the degree of relationship, or co-relation (as he called it), between any two metric variables obtained from the same individuals (or relatives) in a given population. Examples are individuals' height and weight or the resemblance between parents and children, or between siblings, in a given trait.

In 1896, Karl Pearson (1857-1936), a noted mathematician who became a Galton disciple and has been rightly called the “father of statistics,” revamped Galton’s formulation of co-relation to make it mathematically more elegant and enhance its general applicability. Pearson’s formula yields what is now called “the Pearson product-moment coefficient of correlation.” In the technical literature, however, the word correlation, without a modifier, always signifies Pearson’s coefficient.[4] (The many other types of correlation coefficient are always specified, e.g., intraclass correlation, rank-order correlation, tetrachoric correlation, biserial correlation, point-biserial correlation, partial correlation, semipartial correlation, multiple correlation, canonical correlation, correlation ratio, phi coefficient, contingency coefficient, tau coefficient, concordance coefficient, and congruence coefficient. Each has its specialized use, depending on the type of data.) Pearson’s correlation is the most generally used. Universally symbolized by a lower-case italic r (derived from Galton’s term regression), it is a ubiquitous tool in the biological and behavioral sciences. In differential psychology, it is absolutely essential.

Galton invented many other statistical and psychometric concepts and methods familiar to all present-day researchers, including the bivariate scatter diagram, regression (related to correlation), multiple regression and multiple correlation (by which two or more different variables are used to predict another variable), the conversion of measurements or ranks to percentiles, standardized or scale-free measurements or scores, various types of rating scales, the use of the now familiar normal or bell-shaped curve (originally formulated by the great mathematician Karl Friedrich Gauss [1777-1855]) as a basis for quantifying psychological traits on an equal-interval scale, and using either the median or the geometric mean (instead of the arithmetic mean) as the indicator of central tendency of measurements that have a markedly skewed frequency distribution.
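The last point, preferring the median or geometric mean for skewed data, is easy to demonstrate. A minimal sketch with invented, right-skewed values (the numbers are not from the text):

```python
import math
import statistics

# Invented, markedly right-skewed measurements (e.g., reaction times in ms).
data = [180, 190, 200, 210, 220, 240, 260, 300, 400, 900]

arith = statistics.mean(data)  # dragged upward by the single extreme value
med = statistics.median(data)
geom = math.exp(sum(math.log(v) for v in data) / len(data))  # geometric mean

print(arith, med, geom)  # median and geometric mean sit nearer the bulk of the data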

In his  Inquiries into Human Faculty and Its Development (1883), Galton
described an odd assortment of clever tests and techniques, devised mostly by
himself, for measuring basic human capacities, particularly keenness of sensory
discrimination in the different modalities, imagery, and reaction times to audi­
tory and visual stimuli. Although Galton’s use of gadgetry has been disparaged
as “ brass instrument psychology,” it was a seminal innovation—the  objective
measurement of human capacities. Compared with modern technology, of
course, Galton’s methods were fairly crude, sometimes even inadequate for their
purpose. His intense interest in human variation and his passion for quantitative
data, however, led him to apply his “ brass instrument” techniques to almost
every physical and mental characteristic that could be counted, ranked, or mea­
sured.

Galton obtained many types of data on more than 9,000 persons who, from
1884 to 1890, went through his Anthropometric Laboratory in London’s South
Kensington Science Museum. Each had to pay threepence to serve as subjects
for these tests and measurements. Unfortunately, Galton lacked the powerful
tools of statistical inference that were later developed by Karl Pearson (1857-
1936) and Sir Ronald A. Fisher (1890-1962), and therefore he could only draw
much weaker conclusions than the quality of his massive data really warranted.
He was dismayed that the measurements of sensory discrimination and speed of
reaction appeared to show so little relationship to a person’s level of general
mental ability (as indicated by educational and occupational attainments). It soon
became a widely accepted and long-lasting conclusion that the simple functions
assessed by Galton are unrelated to individual differences in the higher mental
processes, or intelligence. Galton’s “ brass instrument” approach to the study
of human abilities, therefore, was abandoned for nearly a century.

Recently, Galton’s original data have been analyzed by modern methods of
statistical inference.151 It turned out that his original hypotheses were largely
correct after all. R. A. Fisher’s method known as analysis o f variance revealed
highly significant differences between groups differing in educational and oc­
cupational level on Galton’s discrimination and reaction-time tests. Galton’s
scientific intuitions were remarkably good, but the psychometric and statistical
methods then available were not always up to the task of validating them.

Galton Introduces Genetics into Psychology. Galton’s most famous work,
Hereditary Genius (1869), was the forerunner of behavior genetics, nearly a
century before either the term or the field of behavior genetics came into being.
Galton was especially interested in the inheritance of mental ability. Because
there was then no objective scale for measuring mental ability, he devised an­
other criterion of high-level ability:  eminence, based on illustrious achievements
that would justify published biographies, encyclopedia articles, and the like. By
this criterion, he selected many of the most famous intellects of the nineteenth
century, whom he classed as “ illustrious,” and he obtained information about
their ancestors, descendants, and other relatives. His extensive biographical and
genealogical research revealed that the relatives of his illustrious probands were
much more likely to attain eminence than would a random sample of the pop­
ulation with comparable social background. More telling, he noticed that the
probability of eminence in a relative of an illustrious person decreased in a
regular stepwise fashion as the degree of kinship was more remote. Galton
noticed that the same pattern was also true for physical stature and athletic
performance.

Galton made other observations that gave some indication of the power of
family background in producing eminence. In an earlier period of history, it was
customary for popes to adopt orphan boys and rear them like sons, with all the
advantages of culture and education that papal privilege could command. Galton
noted that far fewer of these adopted boys ever attained eminence than did the
natural sons of fathers whose eminence was comparable to a pope’s. From such
circumstantial evidence, Galton concluded that mental ability is inherited in
much the same manner, and to about the same degree, as physical traits.

Galton further concluded that what was inherited was essentially a  general
ability, because eminent relatives in the same family line were often famous in
quite different fields, such as literature, mathematics, and music. He supposed
that this hereditary general ability could be channeled by circumstance or interest
into different kinds of intellectual endeavor. He also recognized special abilities,
or talent, in fields like art and music, but considered them less important than
general ability in explaining outstanding accomplishment, because a high level
of general ability characterized all of his illustrious persons. (Galton noted that
they were also characterized by the unusual zeal and persistence they brought
to their endeavors.) He argued, for example, that the inborn musical gift of a
Beethoven could not have been expressed in works of genius were it not ac­
companied by superior general ability. In Hereditary Genius, he summarized his
concept of general ability in his typically quaint style: “ Numerous instances
recorded in this book show in how small a degree eminence can be considered
as due to purely special powers. People lay too much stress on apparent spe­
cialities, thinking that because a man is devoted to some particular pursuit he
would not have succeeded in anything else. They might as well say that, because
a youth has fallen in love with a brunette, he could not possibly have fallen in
love with a blonde. As likely as not the affair was mainly or wholly due to a
general amorousness” (p. 64).

Ga l to n ’s Anecdotal Report on Twins. The use of twins to study the inher­
itance of behavioral traits was another of Galton’s important “ firsts.” He noted
that there were two types of twins, judging from their degree of resemblance.
“ Identical” twins come from one egg (hence they are now called monozygotic,
or MZ, twins), which divides in two shortly after fertilization. Their genetic
makeup is identical; thus their genetic correlation is unity (r = 1). And they are
very alike in appearance. “ Fraternal” twins (now called dizygotic, or DZ) come
from two different fertilized eggs and have the same genetic relationship as
ordinary siblings, with a genetic correlation of about one-half (on average). That
is, DZ twins are, on average, about one-half as similar, genetically, as MZ twins.
DZ twins are no more alike in appearance than ordinary siblings when they are
compared at the same age.

Galton was interested in twins’ similarities and differences, especially in MZ
twins, as any difference would reflect only the influence of environment or
nongenetic factors. He located some eighty pairs of twins whose close physical
resemblance suggested they were MZ, and he collected anecdotal data on their
behavioral characteristics from their relatives and friends and from the twins
themselves. He concluded that since the twins were so strikingly similar in their
traits, compared to ordinary siblings, heredity was the predominant cause of
differences in individuals’ psychological characteristics.

Because Galton obtained no actual measurements, systematic observations, or
quantitative data, his conclusions are of course liable to the well-known short­
comings of all anecdotal reports. Later research, however, based on the more
precise methods of modern psychometrics and biometrical genetics, has largely
substantiated Galton’s surmise about the relative importance of heredity and
environment for individual differences in general mental ability. But Galton’s
research on heredity is cited nowadays only for its historical interest as the
prototype of the essential questions and methods that gave rise to modern be­
havioral genetics. It is a fact that most of the questions of present interest to
researchers in behavioral genetics and differential psychology were originally
thought of by Galton. His own answers to many of the questions, admittedly
based on inadequate evidence, have proved to be remarkably close to the con­
clusions of present-day researchers. In the history of science, of course, the
persons remembered as great pioneers are those who asked the fundamental
questions, thought of novel ways to find the answers, and, in retrospect, had
many correct and fruitful ideas. By these criteria, Galton unquestionably quali­
fies.

Ga l to n ’s Concept of Mental Ability. Galton seldom used the word  intelli­
gence and never offered a formal definition. From everything he wrote about
ability, however, we can well imagine that, if he had felt a definition necessary,
he would have said something like  innate, general, cognitive ability. The term
cognitive clearly distinguishes it from the two other attributes of Plato’s triarchic
conception of the mind, the affective and conative. Galton’s favored term, men­
tal ability, comprises both general ability and a number of special abilities—he
mentioned linguistic, mathematical, musical, artistic, and memorial. General
ability denotes a power of mind that affects (to some degree) the quality of
virtually everything a person does that requires more than simple sensory acuity
or sheer physical strength, endurance, dexterity, or coordination.

Analogizing from the normal, bell-shaped distribution of large-sample data
on physical features, such as stature, Galton assumed that the frequency distri­
bution of ability in the population would approximate the normal curve. He
divided the normal curve’s baseline into sixteen equal intervals (a purely arbi­
trary, but convenient, number) to create a scale for quantifying individual and
group differences in general ability. But Galton’s scale is no longer used. Ever
since Karl Pearson, in 1893, invented the  standard deviation, the baseline of
the normal distribution has been interval-scaled in units of the standard devia­
tion, symbolized by c (the lower-case Greek letter sigma). Simple calculation
shows that each interval of Galton’s scale is equal to 0.696o, which is equivalent
to 10.44 IQ points, when the o of IQ is 15 IQ points. Hence Galton’s scale of
mental ability, in terms of IQ, ranges from about 16 to 184.

Galton was unsuccessful, however, in actually  measuring individual differ­
ences in intelligence. We can easily see with hindsight that his particular battery
of simple tests was unsuited for assessing the higher mental processes that peo­
ple think of as “ intelligence.” Where did Galton go wrong? Like Herbert Spen­
cer, he was immensely impressed by Darwin’s theory of natural selection as the
mechanism of evolution. And hereditary individual variation is the raw material
on which natural selection works by, in Darwinian terms, “ selection of the fittest
in the struggle for survival.” Also, Galton was influenced by Locke’s teaching
that the mind’s content is originally gained through the avenue of the five senses,
which provide all the raw material for the association of impressions to form
ideas, knowledge, and intelligence. From Darwin’s and Locke’s theories, Galton
theorized that, in his words, “ the more perceptive the senses are of differences,
the larger is the field upon which our judgement and intelligence can act”
{Human Faculty, 1883, p. 19). Among many other factors that conferred advan­
tages in the competition for survival, individual variation in keenness of sensory
discrimination, as well as quickness of reaction to external stimuli, would have
been positively selected in the evolution of human intelligence.

It seemed to Galton a reasonable hypothesis, therefore, that tests of fine sen­
sory  discrimination (not just simple acuity) and of reaction time to visual and
auditory stimuli would provide objective measures of individual differences in
the elemental components of mental ability, unaffected by education, occupation,
or social status. The previously described battery of tests Galton devised for this
purpose, it turned out, yielded measurements that correlated so poorly with com-
monsense criteria of intellectual distinction (such as election to the Royal So­
ciety) as to be unconvincing as a measure of intelligence, much less having any
practical value. Statistical techniques were not then available to prove the the­
oretical significance, if any, of the slight relationship that existed between the
laboratory measures and independent estimates of ability. Galton had tested
thousands of subjects, and all of his data were carefully preserved. When re­
cently they were analyzed by modern statistical methods, highly significant (that
is, nonchance) differences were found between the  average scores obtained by
various groups of people aggregated by age, education, and occupation.151 This
finding lent considerable theoretical interest to Galton’s tests, although they
would have no practical validity for individual assessment.

Binet and the F irs t Practical Test of Intelligence. At the behest of the Paris
school system, Alfred Binet in 1905 invented the first valid and practically useful
test of intelligence. Influenced by Galton and aware of his disappointing results,
Binet (1857-1911) borrowed a few of Galton’s more promising tests (for ex­
ample, memory span for digits and the discrimination of weights) but also de­
vised new tests of much greater mental complexity so as to engage the higher
mental processes—reasoning, judgment, planning, verbal comprehension, and
acquisition of knowledge. Test scores scaled in units of mental age derived from
Binet’s battery proved to have practical value in identifying mentally retarded
children and in assessing children’s readiness for schoolwork. The story of Bi­
net’s practical ingenuity, clinical wisdom, and the lasting influence of his test
is deservedly well known to students of mental measurement.171 The reason that
Binet’s test worked so well, however, remained unexplained by Binet, except
in intuitive and commonsense terms. A truly theory-based explanation had to
wait for the British psychologist Charles Spearman (1863-1945), whose mo­
mentous contributions are reviewed in the next chapter.

Galton on Race Differences in Ability. The discussion of Galton’s work in
differential psychology would be incomplete without mentioning one other topic
that interested him—race differences in mental ability. The title itself of his
chapter on this subject in  Hereditary Genius would be extremely unacceptable
today: “ The Comparative Worth of Different Races.” But Galton’s style of
writing about race was common among nineteenth-century intellectuals, without
(he slightest implication that they were mean-spirited, unkindly, or at all un­
friendly toward people of another race. A style like Galton’s is seen in state­
ments about race made by even such democratic and humanitarian heroes as
Jefferson and Lincoln.

Galton had no tests for obtaining direct measurements of cognitive ability.
Yet he tried to estimate the mean levels of mental capacity possessed by different
racial and national groups on his interval scale of the normal curve. His esti­
mates—many would say guesses—were based on his observations of people of
different races encountered on his extensive travels in Europe and Africa, on
anecdotal reports of other travelers, on the number and quality of the inventions
and intellectual accomplishments of different racial groups, and on the percent­
age of eminent men in each group, culled from biographical sources. He ven­
tured that the level of ability among the ancient Athenian Greeks averaged “ two
grades” higher than that of the average Englishmen of his own day. (Two grades
on Galton’s scale is equivalent to 20.9 IQ points.) Obviously, there is no pos­
sibility of ever determining if Galton’s estimate was anywhere near correct. He
also estimated that African Negroes averaged “ at least two grades” (i.e., 1.39a,
or 20.9 IQ points) below the English average. This estimate appears remarkably
close to the results for phenotypic ability assessed by culture-reduced IQ tests.
Studies in sub-Saharan Africa indicate an average difference (on culture-reduced
nonverbal tests of reasoning) equivalent to 1.43a, or 21.5 IQ points between
blacks and whites.8 U.S. data from the Armed Forces Qualification Test (AFQT),
obtained in 1980 on large representative samples of black and white youths,
show an average difference of 1.36a (equivalent to 20.4 IQ points)—not far
from Galton’s estimate (1.39a, or 20.9 IQ points).9 But intuition and informed
guesses, though valuable in generating hypotheses, are never acceptable as ev­
idence in scientific research. Present-day scientists, therefore, properly dismiss
Galton’s opinions on race. Except as hypotheses, their interest is now purely
biographical and historical.

NOTE 3

3. The literature on Galton is extensive. The most accessible biography is by Forrest
(1974). Fancher (1985a) gives a shorter and highly readable account. A still briefer
account of Galton’s life and contributions to psychology is given in Jensen (1994a),
which also lists the principal biographical references to Galton. His own memoir (Galton,
1908) is good reading, but does not particularly detail his contributions to psychology,
a subject reviewed most thoroughly by Cyril Burt (1962). Galton’s activities in each of
the branches o f science to which he made original contributions are detailed in a collec­
tion o f essays, each by one o f fourteen experts in the relevant fields; the book also
includes a complete bibliography o f Galton’s published works, edited by Keynes (1993).
Fancher (1983a, 1983b, 1983c, 1984) has provided fascinating and probing essays about
quite specific but less well-known aspects o f Galton’s life and contributions to psychol­
ogy. Lewis M. Terman (1877-1956), who is responsible for the Stanford-Binet IQ test,
tried to estimate Galton’s IQ in childhood from a few of his remarkably precocious
achievements even long before he went to school. These are detailed in Terman’s (1917)
article, in which he concluded that Galton’s childhood IQ was “ not far from 200” (p.
212). One o f Galton’s biographers, Forrest (1974), however, has noted, “ Terman was
misled by Francis’ letter to [his sister] Adele which begins, ‘I am four years old.’ The
date shows that it was only one day short of his fifth birthday. The calculations should
therefore by emended to give an I.Q. of about 160” (p. 7). (Note: Terman estimated IQ
as 100  X  estimated Mental Age (MA)/Chronological Age (CA); he estimated Galton’s
MA as 8 years based on his purported capabilities at CA 5 years, so 100 x 8/5 = 160.)

(all from The g factor, the science of mental ability – Arthur R. Jensen,, chapter 1).

The Keynes book is: The Legacy of His Ideas  by Francis Galton; ed. Milo Keynes.

I found a review of it, here: Sir Francis Galton, FRS The legacy of his ideas. review

I was particular struck by this:

Some contributors  suggest  that  he spread  himself  too  thinly:  that  he did  too many
things and followed up too few. Perhaps  so, but many great  scientists have been
polymaths.  Could  it be something  more  insidious?  That  his major  work  has become
too politically incorrect  to mention?

I am much like Galton, except that im not that smart. I seem to be around 2.3sd above the white mean, but share his mental energy and diverse interests.

Kennethamy on voluntarianism about beliefs

From here. btw this thread was one of the many discussions that helped form my views about what wud later become the essay about begging the question, and the essay about how to define “deductive argument” and “inductive argument”.

Reconstructo

 Do you know much about Jung’s theory of archetypes? If so, what do you make of it?

Kennethamy

 I don’t make much of Jung. Except for the notions of introversion and extroversion. Not my cup of tea. As I said, we don’t create our own beliefs. We acquire them. Beliefs are not voluntary.

Emil

 They are to some extend but not as much as some people think (Pascal’s argument comes to mind).

Kennethamy

 Yes, it does. And that is an issue. His argument does not show anything about this issue. He just assumes that belief is voluntary He does talk about how someone might acquire beliefs. He advises, for instance, that people start going to Mass, and practicing Catholic ritual. And says they will acquire Catholic beliefs that way. It sounds implausible to me. It is a little like the old joke about a well-known skeptic, who puts a horseshoe on his door for good luck. A friend of his sees the horseshoe and says, “But I thought you did not believe in that kind of thing”. To which the skeptic replied, “I don’t, but I hear that it works even if you don’t believe it”.

 

Kennethamy on ordinary language filosofy, and ‘deep, profound questions’

From here.

Kennethamy

Frankly I cannot answer your question about Lancan because I really don’t understand what he is saying. However, let me ask you, in turn, what you think about the following quotation from Wittgenstein’s Philosophical Investigations. I think it is relevant to this discussion.

We are under the illusion that what is peculiar, profound, essential in our investigation, resides in its trying to grasp the incomparable essence of language. That is, the order existing between the concepts of proposition; word, proof, truth, experience, and so on. This order is a super-order between – so to speak – super-concepts. Whereas, of course, if the words “language,” “experience,” “world,” have a use, it must be as humble a one as that of the words “table,” “lamp,” “door.” (p. 44e)

Emil

It is funny that you bring up W. in this, Ken, as he wrote most incomprehensibly! Perhaps he was doing analytic philosophy but it is certainly extremely hard to understand anything he wrote. It’s not like reading Hume which is also hard to understand. H. is hard to understand because the texts he wrote were written 250 years ago or so. W. wrote only some 70-50 years ago and yet I can’t understand it easily. I can understand other persons from the same era just fine (Clifford, W. James, Quine, Russell, etc.).

Kennethamy

W. wrote aphoristically (Like Lichtenstein) so you have to get used to his style. But what of the passage. Do you understand that?

Emil

No, I have no clue what it means. I didn’t read PI yet so maybe that is why. I read the Tractatus.

Kennethamy

Well, he says that philosophers should not think that words like, “knowledge” or “reality” have a different kind of meaning than, and need a different kind of understanding from, ordinary words like “lamp” and “table”. “Philosophical” words are not special. Their meanings are to be discovered in how they are ordinarily used. (That does not, I think, suppose you have read, PI).

Emil

Alright. Then why didn’t he just write what you just wrote? I suppose this is the paradigmatic thesis of the ordinary language philosophy.

Kennethamy

First of all it was in German. And second, it wasn’t his style. But I don’t think it was particularly hard to get that out of it. Yes, it is ordinary language philosophy. But, going beyond interpretation (I hope) don’t you think it is true? Why should “knowledge” (say) be treated differently from “lamp”?

Emil

I think it is. Especially for a person that hasn’t read much of W.’s works. You have read a lot more than I have.

I agree with it, yes.

Kennethamy

There are lots of people who think that words like “knowledge” and “information” are superconcepts which have a special philosophical meaning they do not have in ordinary discourse (and which it is beneath philosophy to treat like the word, “lamp”) That’s why they are interested in what some particular philosopher means by, “knowledge”. They think there is some “incomparable essence of language” that philosophers are “trying to grasp”.

Emil

Ok. But some words do have meanings in philosophical contexts that they do not have in other, normal contexts. Think of “valid” as an example.

Kennethamy

Yes, of course. But in that sense, “valid” is a technical term. “Knowledge” is not a technical term in the ordinary sense. It doesn’t have some deep philosophical meaning in addition to its ordinary meaning, nor is its ordinary meaning some deep meaning detached from its usual meaning. What meaning could Lacan find that was the real philosophical meaning? Where would that meaning even come from? Heidegger does the same thing. He ignores what a word means, and then finds (invents” a deep philosophical meaning for it. But he uses etymology to do that. It is wrong-headed from the word “go”. If you read Plato’s Cratylus you find how Socrates makes fun of this view of meaning (although, Plato here is making fun of himself, because he really originates this idea that the meaning of a word is its essence which is hidden).

Wittgenstein’s positive point is, of course, the ordinary language thing. But his negative point (which I think is more important for this discussion) is that terms like “knowledge” or “truth” do not have special meanings to be dug out by philosophers who are supposed to have some special factual for spying them. Lancan has no particular insight into the essence of knowledge hidden from the rest of us which, if we understand him, will provide us with philosophical enlightenment. Why should he?

Jeeprs

@kennethamy,
There is a risk in all of this that by excluding the idea of the ‘super concept’ in W’s sense, or insisting that it must simply have the same kind of meaning as ‘lamp’ or ‘table’ that you also exclude what is most distinctive about philosophy. Surely we can acknowledge that there is a distinction between abstract and concrete expression. ‘The lamp is on the table’ is a different kind of expression to ‘knowledge has limits’.

When we ‘discuss language’ we are on a different level of explanation to merely ‘using language’. I mean, using language, you can explain many things, especially concrete and specific things, like ‘this is how to fix a lamp’ or ‘this is how to build a table’. But when it comes to discussing language itself, we are up against a different order of problem, not least of which is that we are employing the subject of the analysis to conduct the analysis. (I have a feeling that Wittgenstein said this somewhere.)

So it is important to recognise what language is for and what it can and can’t do. There are some kinds of speculations which can be articulated and might be answerable. But there are others which you can say, but might not really be possible to answer, even though they seem very simple (such as, what is number/meaning/the nature of being). Of which Wittgenstein said, that of which we cannot speak, of that we must remain silent. So knowing what not to say must be part of this whole consideration.

Kennethamy

“Lamp” is a term for a concrete object. “Knowledge” is a term for an abstract object. But the central point is that neither has a hidden meaning that only a philosopher can ferret out. The meaning of both are their use(s) by fluent speakers of the language. It is not necessary to go to Lancan or Nietzsche to discover what “knowledge” really means anymore that it is to discover what “lamp” really means. As Wittgenstein wrote, “nothing is hidden”. Philosophy is not science. It is not necessary to go underneath the phenomena to discover what there really is. It is ironic that interpretationists accuse analytic philosophy of “scientism” when it is they who think that philosophy is a kind of science.

Reconstructo

@kennethamy,
I interpret Wittgenstein as saying that the philosophical language-game is not a privileged language game. To say that something isn’t hidden is not to say that everyone finds it. This is just figurative language. Wittgenstein should be read by the light of Wittgenstein. His game is one more game, the game of describing the game. I interpret him as shattering the hope (for himself and those whom he persuades) for some unified authority on meaning.
Also he stressed the relationship of language and social practice. He finally took a more holistic view of language, and dropped his reductive Tractatus views. (This is not to deny the greatness of the Tractatus. Witt is one of my favorites, early and late.)
I associate Wittgenstein with a confession of the impossibility of closure. I don’t think language is capable of tying itself up.

Kennethamy

To say that “nothing is hidden” is to say that words like “truth” or “knowledge” do not have, in addition to their ordinary everyday meanings, some secret meanings that only philosophers are able to discover. There are no secret meanings. There is no, “what the word ‘really means'” that Lacan or Heidegger has discovered.

————————-

Jeeprs

Well my reason is that a lot of what goes on in this life seems perfectly meaningless and in the true sense of the word, irrational. Many things which seem highly valued by a lot of people seem hardly worth the effort of pursuing, we live our three score years and ten, if we’re lucky, and then vanish into the oblivion from whence we came. None of it seems to make much sense to me. I am the outcome, or at least an expression, of a process which started billions of years ago inside some star somewhere. For what? Watch television? Work until I die?

That’s my reason.

Kennethamy

Just what are you questioning? (One sense of the word, “meaningless” may well be something like “irrational”. But that is not the true sense of the word. What about all the other senses of the word, “meaningless”? ). By the way, I think that “non-rational” would be a better term than “irrational”. And, just one more thing: what would it be for what goes on in this world to be rational? If you could tell me that, then I would have a better idea of what it is you are saying when you say it is irrational or it is non-rational. What is it that it is not? What would it be for you to discover that what goes on is rational?

Jeeprs

Have you ever looked out at life and thought ‘boy what does it all mean? Isn’t there more to it than just our little lives and personalities and the things we do and have?’ You know, asked The Big Questions. That’s really what I see philosophy as being. So now I am beginning to understand why we always seem to be arguing at cross purposes.

Dunno. Maybe I shouldn’t say this stuff. Maybe I am being too personal or too earnest.

Kennethamy

In my opinion, it is the belief that philosophers are supposed to ask only the Big Questions that partly fuels the view that philosophy gets nowhere and is a lot of nonsense, and is a big waste of time. And that would be right if that is what philosophy is.

Where would science have got if scientists had not rolled up their sleeves and asked many little questions?

Jeeprs

@kennethamy,
from what I know of Heidegger, I very much admire his philosophy. There are many philosophers I admire, and many of them do deal with profound questions; and I know there are many kindred spirits on the forum. But – each to his own, I don’t want to labour the point.

Kennethamy

How about “deal with seemingly profound questions”? But one of the philosopher’s seminal jobs is to ask whether a seemingly profound question is really all that profound, and what the question means, and what it supposes is true. Philosophers should have Hume’s “tincture of scepticism” even in regard to questions.

Kennethamy on the analytic principle of analyzing questions

From here.

Kennethamy

So many threads ask whether this or that is logical. Is probability logical? Are moral arguments logical? And so on. I never know what it is being asked by such questions. Is there something clear and specific that is being asked by the question, is X logical? What is it?

Kroni

@kennethamy,
Maybe they’re asking if it can be identified through premises and conclusions… Or maybe they are trying to figure out if abstract concepts like morality follow some kind of mathematical pattern or have a logical purpose for existing.

Emil

What does “logical purpose” mean?

Kennethamy

I imagine it might be asking whether the purpose is something that can be accomplished, or whether the purpose is worth accomplishing. The trouble is that it can mean so many different things that the question, “is it logical?”, does not convey anything really being asked.

So, rather than simply ask whether X is logical, why not, instead, ask about the problem you have in mind when you ask the question. And, maybe, if you think about what the problem is and cannot come up with anything specific or clear, maybe you will wait to ask the question, or maybe not ask the question at all.

Emil

Basically the analytic principle of questions. Always start by analyzing the question.

Kennethamy

It was a great advance in philosophy when it was understood that philosophical questions had to be analyzed to determine what they were asking, or whether they were asking anything sensible, before trying to answer them. In the sciences, it is taken for granted that the important thing is to answer the questions. But it took some time to recognize that was not true in the case of philosophy.