Thursday, April 30, 2009

[whole wide world] two versions

A second version I quite like.

It's no crime to dream.

[thoughtful thursday] anyone for ...

[data protection] snippets of shoddy practice

Sometimes it’s as well to give yourself an hour and wade through inter-governmental reports, commissions on the reports and reports on the commissions, along with further analysis.

The movers and shakers rely on watchdog scrutiny of policy decisions being like finding a needle in a haystack, and the serious blogger who does take the time and writes his 32 page report on it – a DK or Unity, for example – is pushing faeces uphill to get the wider public to see, to understand and to accept it.

Below is a small example. Imagine you had a report on:

… also on:

… and the Statewatch report on that. Imagine you woke up, say, at 05:45 and for some crazy reason, saw a folder within a folder within a folder, containing these reports and you began to read them again. Imagine you came out with these snippets:

What conclusion could you come to?

Correct me if I’m wrong, but this blogger concludes that despite Euro directives already in place, despite ongoing data protection discussions and despite the existence of Euro-courts and the like, a group of interior ministers held a meeting with a view to rushing through data availability and cross-border data collection/sharing, and our government went along with and supported this.

The problem is that the meeting was outside EU law, and yet its conclusions went back to national parliaments and were pushed through without any discussion, without publicizing that it was even happening, the bills presented as faits accomplis – all amidst calls for greater transparency from others.

The bottom line is that your data in the UK is shared with other EU member nations, with no checks and balances and it’s at their whim if it gets out to the wider world. Thank you so much, Gordo’s warlocks and harpies.

I haven’t gone into any detail on the issue itself in this post and the snippets are hardly conclusive, merely indicative. Yet am I wrong in concluding that there is some very shoddy practice going on here, which would have us subject to disciplinary proceedings and out on our ear if we were to try this on?

Another thing

One of the most frustrating things for the poor blogger who is even halfway serious is that he can bring the material to the readership [I’m referring to other poor sods, not myself here] and answer both serious and spurious questions on it in his comments section. But how many people have actually waded through the material, and how many see a block of print and tune out, rendering the work he’d put into it well-nigh irrelevant?

Sonus must feel frustrated that way.

How many bloggers write long-winded posts, which is not to say that the research was not impeccable or the conclusions not sound, only to turn around and click out of anyone else’s post if it goes much beyond five paragraphs, not really interested in the other’s take on it?

I’m still stunned by the blogger who went to … I forget, was it Mr. Eugenides [?] … and in response to an excellent post, commented, ‘My view is here,’ complete with hyperlink. He was, at least, being nakedly honest that he didn’t give a rat’s back passage what Mr. E had written but on the other hand, we readers were meant to go to his site and read his morsels of wisdom.

My unstated response to this man at the time was, ‘You can F off.’

Similarly, some years back, I had a friend who, in response to any statement I made, any statement at all, would automatically open his response with, ‘No,’ or ‘Not at all,’ or ‘You’re wrong there,’ thus dismissing everything I’d just said holus bolus. I’m sure he didn’t actually dismiss all, it was just his unfortunate mannerism.

He’d then proceed to state his own case, at least tacitly acknowledging my points in his argument, but as that argument was now couched in his own terms, it now had the official stamp of truth to it. When I’d point out that this had been largely what I’d been saying, he’d reply, ‘Not at all.’

Eventually he began saying, ‘I accept that but –’ which had the effect on me of at least wishing to continue dialogue with him for some years more.

None of which solves the problem of the gangsters at the top in our society.

Here's an interesting post on biometrics and related issues.

Wednesday, April 29, 2009

[musos] jj and eric

[wordless wednesday] when eagles fly

[education destroyed] further comment

Spot on IMHO. Please read all of it.

The Tories, to be fair, are already making noises about taking control of education away from the centre and handing it back to local communities. The trouble is that the "education community" at every level has been indoctrinated with "progressive" ideas for over a century. If you want radical thinking on education, you need to listen to parents, not educators. The "professionals" have been divorced from their true "clients" (the parents) for so long, that they really don't think of their views as important.

But ultimately it is the parents who have a responsibility to educate their children. State education began as a laudable attempt to ensure even the poorest could fulfil that responsibility. It has ended in the usurpation of their role. If parents were allowed to apply common sense to education, I guarantee it would be "learning-centred" not "child-centred" and that it would ask more of students than the educational establishment currently does.

[sonus] understanding issues: part 7

Let it never be said that alternative views are suppressed at this site. Take it away, Sonus:

I have written this in direct response to James.

Heliopolis was the Sun Centre of Egypt, but this was its Greek name. The Israelites called the place On, but to the Egyptians it was known as Annu, from which the Latin-English word “annum” (year) derives. In Akkadian Mesopotamia, Annu, or Anum, had been the equivalent of Ra, whereas in Sumer (southern Mesopotamia) he was the great Sky God Anu (remember the much earlier “Anunnaki”). It is for this reason that the winged disc of the Lord of the Sun is found in both Mesopotamia and Egypt.

The determination of the earthly calendar was said to be the prerogative of the Great Anu. An “annum” related to the Earth's solar orbit and was denoted by a “point within a circle”. It was called a “Sha”, an ideogram of 360 degrees, stemming from “sha-at-am”, which literally means “a passing”, with a 360-degree passing being defined as an orbit. (Even then, about 3,000 BC and earlier, it was understood that the Earth revolved around the Sun.) The “Orbit of Light” was deemed to be the realm of the Sun God, and was thus defined as the “Sha-Ra-On” (Sharon).

The transmitter of light (the light bearer) was the Rose of Sharon, the carrier of the “Rosi Crucis”. “Rosi” represented the “Ritu” (the redness of truth) and “Crucis” related to a cup, as in “crucible”. This was equivalent to the “Sacred Vessel of Light” in Kabbalah, and is why the mystical technology of the Zohar has been linked to the “Holy Grail”. The light bearer has been variously identified in different cultures, from Nin-Kharsag (via the Anunnaki) to Venus, and in that guise was the queen in the Old Testament Song of Solomon 2:1: “I am the rose of Sharon ...” (I am the Truth of the Orbit of Light).

(But then again, all the cultures involved here loved their puns; it was an art form. They used puns to amuse, to explain, but also to hide “things” from those lacking comprehension.)

An unforeseen problem beset “The Royal Society”, the “Invisible College” and emergent Freemasonry in England in 1667. They were aided officially by the Stuart monarchy, whose encouragement of scientific research, even alchemical research, and (relative) encouragement of freedom of religion were anathema to the elite Anglican church, being directly related to this “Sharon”, this “light”, this “enlightenment” aspect of their study. The Puritan faction in England was damaging to cultural pursuits in many ways, but most effectively and permanently by way of its own literary culture. As the Cromwellian movement set its sights fiercely against kingship and the Royal House of Stuart in military terms, so too did its writers. The Puritan movement was threatening to all academic achievement of that time, with intellects such as Isaac Newton being constrained in the publication of their work (which in Newton's case had a large alchemical content).

There is strong evidence that he solved the enigma of the “Philosopher's Stone”, although the technology of that time would not allow for its physical replication. His unpublished papers were purchased at auction by John Maynard Keynes, who referred to him as “the last of the magicians; the last of the Babylonians; the last of the Sumerians”. (Indeed, it would be the Vatican who financed William of Orange, at the request of the Anglican church in England, to remove the Stuart monarchy – which makes all the “troubles” in Northern Ireland seem totally misguided. Some would say a sad indictment of state-sponsored religion; others would be more worldly and say it was nothing more than expected.)

The London-born poet John Milton (1608-1674) was a powerful writer. “Paradise Lost” employed powerful rhetoric and dogma, and was a direct assault on the new philosophy of Christopher Wren and others of the Royal Society. Ignoring Copernicus, Galileo and the scientific discoveries of his day, Milton's cosmic vision centred on the traditional Christian belief that the Earth was at the centre of the universe.

Paradise Lost concerns the revolt of Satan, his fall from grace, and the establishment of Hell. At the time of publication, the Royal Society, under Stuart encouragement, was making remarkable discoveries and the “Masonic Debate” was at its peak, with the Puritans and the “Kirk Presbytery” claiming that Royal Society members and Masons were occultists with “second sight” (the evil eye) who could see the invisible; worse, the Rosicrucian academics of the Royal Society actually believed that the Earth revolved round the Sun! All were therefore also accused of being heretical sun-cultists!

Milton added more however.

The Old Testament Book of Isaiah, 14:12, prophesied the overthrow of Babylon's king, saying: “How are you fallen from Heaven, Day Star, Son of the Dawn”. As is made clear by the term “son of the dawn”, the Isaiah reference was to the King of Babylon, but astronomically the “day star”, or “morning star”, is Venus. In Latin, Venus, the “Light Bringer”, was referred to as the “Lux-fer”, commonly written as Lucifer. Milton deliberately, deceitfully, treated this descriptive feminine term as a proper noun (in accordance with St Jerome's “Vulgate” translation, as it appears in the Isaiah verse today). Additionally, in Paradise Lost, Lucifer was associated with Satan:

“Of Lucifer, so by allusion called,

Of that bright star to Satan paragon'd.”

Prior to 1667, Venus was “female” and the term Lucifer, (“lux-fer”, “light bringer”) had never been associated with a male entity, and certainly not with a satan.

Nathan Bailey's Etymological Dictionary (1721-94) states: “Lucifer – The Morning or Day Star; The Planet Venus when it rises before the Sun”.

Milton's work therefore declared that Royal Society members, the Invisible College and Masons were not just sun cultists; they were also Satanists!

Therefore, from 1667, Lucifer became an alternative name for Satan, while the “Lux-fer” associations with Venus, light bearer and Goddess of Love, were forgotten by way of religious clerical indoctrination and control. Significantly, three hundred years later, the Puritan view is still being regurgitated by hard-line religious extremists pursuing a modern-day witch hunt based on an entirely deliberate mistranslation. Indeed, an entire film and electronic gaming industry has grown around this aberrant thinking.

The dishonesty of the “Vulgate” translation is visible: the direct Greek equivalent of “Lux-fer” is “phos phoros”, from whence phosphorus derives. When this was used in the New Testament (2 Peter 1:19), it was translated as “day star”.

The Oxford English Dictionary gives Phosphorus as “relating to the morning star”, and it was applied to the Messiah in Revelation 22:16: “I am the root and offspring of David, and the Bright and Morning Star”.

The original term in Isaiah was not “phos phoros” but the Hebrew “heylel”. Heylel relates to “boasting, boastful”. Isaiah should not read “How are you fallen from Heaven, day star, son of the dawn” but “How are you fallen from Heaven, boastful one, son of the dawn” – that is, a direct reference to the King of Babylon, with no connection to Venus or a light bearer of any kind. Milton, because of his religious views, deliberately created a monstrous falsehood which “religion” – the Anglican church and Roman Catholicism – was able to capitalise on in order to increase its “control” of a fearful population. Commercial interests similarly found it very lucrative, centuries later.

The Puritan view, and established religion, thus distorted original facts and writings, and created the belief that Lucifer, a “light bearer”, or more accurately in the case of Venus a “light reflector”, was in fact another name for a Satan, a name that has extremely evil connotations. The motivation was derived entirely from Vatican/Catholic teachings that denied recent scientific discoveries and sought to enforce their dogma, to strengthen their power base, by the deliberate propagation of fear in a largely illiterate population; acquiescence was enforced through fire and torture.

Since the Puritan faction in England at that time seemed to fear Satan(s), and to ascribe to them such terrible powers, a closer scrutiny of Satan(s) is in order.

The modern perception of Satan as an evil imperialist whose despicable hordes wage war on God and humankind is an invention of the evangelical era, as are so many other painfully wrong concepts. Satan is a fabulous myth, with no more authenticity than the players in a Gothic novel.

Satans, though rarely mentioned in the Old Testament, are generally portrayed as obedient servants or sons of God who perform specific duties of strategic obstruction. The Hebrew root of the word satan is STN, which defines an opposer, adversary or accuser, whereas the Greek equivalent was “diabolos” (from which derive the words diabolical and devil), again meaning no more than obstructor or slanderer.

Until the Roman Christian era, the term satan had no sinister connotation whatsoever, and in biblical times members of a political opposition party would customarily have been called “satans”.

In the Old Testament, satans are seen as members of the heavenly court – angels who carry out God's more aggressive dictates.

In Job (1:6-12, 2:1-7) a satan is sent twice to tease and frustrate Job, but with the express instruction that he should not harm the man – an instruction that is obeyed.

In 1 Chronicles 21:1, a satan figure suggests that King David should count the number of the Children of Israel.

Another satan appears in Psalm 109:6.

A magistrate type of satan appears in Zechariah 3:1-2, agreeing with the Israelites in their endeavours to re-establish their family stations in Jerusalem after the Babylonian captivity.

These are the only entries – there are only four in the entire Old Testament – and in no instance is anything dark or sinister implied.

In the New Testament, only one reference introduces a devil character. Other satanic entries are symbolic, e.g. at the Last Supper it is stated: “Then entered Satan into Judas surnamed Iscariot, being of the number of the twelve”.

Elsewhere, when the scribes admonished Jesus for performing exorcisms when he was not himself a priest: “he called them unto him, and said unto them in parables, How can Satan cast out Satan?”

A few other references in the Acts and the Epistles are of a similarly obscure nature.

Revelation refers to blasphemers as being of the synagogue of Satan, while claiming that, having been dismissed from heaven, satan would be imprisoned for 1,000 years.

In Matthew 16:23, Jesus rebukes Peter for being a satan for advising Jesus against becoming too complacent. Jesus says, “Get thee behind me, Satan: thou art an offence unto me, for thou savourest not the things that be of God, but those that be of men.”

The best-known satanic reference in the New Testament is in Matthew 4:5-11, where Satan took Jesus up to an exceedingly high mountain, “and showeth him all the kingdoms of the world, and the glory of them: and saith unto him, All these things will I give thee, if thou wilt fall down and worship me”. Jesus declined the offer, whereupon “the devil leaveth him and, behold, angels came and ministered unto him”.

Whoever the tempter may have been, (real or symbolic) he is not presented as being in any way persuasive or influential, and bears no similarity to the terrible demon of satanic mythology.

There is nothing fearsome in any of the biblical portrayals of satan(s), not even the vaguest reference to a physical form. So where did the diabolical horned Satan of the fire-and-brimstone preachers come from?

In the first century and thereafter, many sects and creeds in the Roman Empire were vying for authority. This frequently exploded into violence. Finally, in 325 AD, the Roman Emperor Constantine I, at a meeting of some 300 Christian bishops at Nicea in Turkey, convened by him, declared that Christianity would become the state religion of the Roman Empire. Jesus was therefore declared to be a God, 325 years later, by the Roman Emperor, and the result of these deliberations is the Nicean or Nicene Creed, so called after the location of the council, which is recited to this day in most Anglican and Catholic churches.

The creation of a common religion for the Roman Empire, which we call Christianity today, was undertaken to restore control to an empire that faced internal revolt and civil war, an empire that in any event was dying in the western regions. Once constituted as the “State Religion”, Christians became no less resolute in their suppression of other religions than their oppressors had been shortly before the Nicea meeting, and squabbling between differing Christian sects continued for many years, mostly over irrationally technical points.

The early Catholic faith was based on the subjugation of the masses to the dominion of the bishops, and to facilitate this an Antichrist (anti-Catholic) figure was necessary as a perceived enemy. This enemy was said to be Satan, the “evil one” who would claim the souls of any who did not offer absolute obedience to the church. Authority was then established on the back of a statement made by St Paul in the New Testament Epistle to the Romans (13:1-2):

“Let every soul be subject unto the higher powers. For there is no power but of God; the powers that be are ordained of God. Whosoever therefore resisteth the power, resisteth the ordinance of God; and they that resist shall receive to themselves damnation”.

The Catholic Church thus became the self-nominated bridge between God and the people. This was done by granting a vicarious office to the Pope, who became designated “Vicar of Christ”.

For this scheme of threat and trepidation to succeed, it became necessary to promote the notion that this diabolical Satan had existed from the beginning of time, and there was no earlier story with which he could be associated than that of Adam and Eve. Genesis, however, made no mention of Satan, but there was the account of Eve and the Serpent. It was therefore necessary to rewrite the original text, which was a Jewish text – and Christianity had long since departed from Judaism, even the liberal Judaism taught by Jesus.

No comprehensive translation of the Bible existed at that time. The Jews had their Hebrew, Aramaic, and Greek versions of the Old Testament, while the primary Christian Bible (the Vulgate) existed in an obscure form of Church Latin, translated from Greek by St Jerome in the 4th century.

Alongside the Roman Church of the West, there were Eastern Christian branches in Syria and Ethiopia, and it was from these areas that the new Genesis accounts emerged.

An Ethiopic work, titled “The Book of Adam and Eve”, subtitled “The Conflict of Adam and Eve with Satan” was produced around the 6th century AD.

From Syria came “The Book of the Cave of Treasures”. It was originally compiled in the 4th century, but surviving copies date from the 6th century AD. This book has Satan as the constant protagonist of evil. In one section, it has Adam and Eve dwelling in a cave, where they are visited fourteen times by Satan, but each time he is put to flight by an angel of God. The book further states that Christianity was in existence before the time of Adam and Eve and the emergent Hebrews.

The biblical satan had no physical description, and early artwork portrayed him no differently from any other angel. In the year 591 AD, Pope Gregory I described Satan as having “horns and hooves, and powers to control the weather”. Thus horned animals (stags and goats) came to be considered devilish, and the imagery gradually became more exaggerated, with a tail and other add-ons based on the satyrs of Greek mythology.

The “Book of the Bee”, a Nestorian Syriac text from 1222 AD by Bishop Shelemon of Basra, follows the same lines as “The Book of the Cave of Treasures”.

The New Testament carries no such character as the Antichrist. The word is only used as a general, descriptive term. 1 John 2:18 says “even now there are many antichrists”, and 1 John 2:22 continues: “he that denieth that Jesus is the Christ; he is antichrist.”

The satanic myth is just that, a myth, concocted long after biblical times to undermine historical record and intimidate Christians into compliance with the dogmatic and subjective rule of the Bishops.

The entirety of this deception was thus nothing more than the preservation of a wealthy hierarchy by deceit in order to control the fearful masses.

The Scene is set.

Don Genaro, and Don Juan are accomplished Sorcerers.

Pablito is apprenticed to Genaro, Carlos is apprenticed to Juan.

The apprentices are being taught The Way of the Warrior as the way forward in their quest to be a Man of Knowledge, to find Freedom, and to become, themselves, what is called a Sorcerer. (Although by western definitions that would be incorrect; they would more correctly be called “a Man of Power”, though the qualification is not limited to the male gender.)

Both Pablito and Carlos are “part qualified” (in western terms), having already spent many years under tutelage from their respective guides.

This is the narrative of Carlos, who is being instructed together with Pablito. The time is almost darkness just before dawn, the location is the Mexican desert, on a high plateau.

Following an outburst of laughter...........

“That's a better mood,” Don Juan said. “And now, before Genaro and I say goodbye to you, you two may say anything you please. It may be the last time you utter a word, ever.”

Pablito shook his head in the negative, but I had something to say. I wanted to express my admiration, my awe for the exquisite temper of Don Juan, and Don Genaro's warrior spirit. But I became entangled in my words and ended up saying nothing; even worse yet, I ended up sounding as if I were complaining again.

Don Juan shook his head and smacked his lips in mock disapproval. I laughed involuntarily; it did not matter, however, that I had flubbed my chance to tell them of my admiration. A very intriguing sensation began to take possession of me. I had a sense of exhilaration and joy, an exquisite freedom that made me laugh. I told Don Juan and Don Genaro that I did not give a fig about the outcome of my encounter with the “Unknown”, that I was happy and complete, and that whether I lived or died was of no importance to me at that moment.

Don Juan and Don Genaro seemed to enjoy my assertions even more than I did. Don Juan slapped his thigh and laughed. Don Genaro threw his hat on the floor and yelled as if he were riding a wild horse.

“We have enjoyed ourselves and laughed while waiting, just as the witness recommended,” Don Genaro said all of a sudden. “But it is the natural condition of order that it should always come to an end.”

He looked at the sky.

“It is almost time for us to disband like the warriors in the story,” he said. “But before we go our separate ways, I must tell you two one last thing. I am going to disclose to you a warrior's secret. Perhaps you can call it a Warrior's Predilection.”

He addressed me in particular and said that once I had told him that the life of a warrior was cold and lonely and devoid of feelings. He even added that at that precise moment I was convinced that it was so.

“The life of a warrior cannot possibly be cold and lonely and without feeling,” he said, “because it is based on his affection, his devotion, his dedication to his beloved. And who, you may ask, is his beloved? I will show you now.”

Don Genaro stood up and walked slowly to a perfectly flat area right in front of us, about ten or twelve feet away. He made a strange gesture there. He moved his hands as if he were sweeping dust from his chest and stomach. Then an odd thing happened. A flash of an almost imperceptible light went through him: it came from the ground and seemed to kindle his entire body. He did a sort of backward pirouette, a backward dive more properly speaking, and landed on his chest and arms. His movement had been executed with such precision and skill that he seemed to be a weightless being, a wormlike creature that had turned on itself. When he was on the ground he performed a series of unearthly movements. He glided just a few inches above the ground, and rolled on it as if he were lying on ball bearings; or he swam on it, describing circles and turning with the swiftness and agility of an eel swimming in the ocean.

My eyes began to cross at one moment and then without any transition I was watching a ball of luminosity sliding back and forth on something that appeared to be the floor of an ice skating rink with a thousand lights shining on it.

The sight was sublime. Then the ball of fire came to a rest. A voice shook me and dispelled my attention. It was Don Juan talking. I could not understand at first what he was saying. I looked again at the ball of fire; I could distinguish only Don Genaro lying on the ground with his arms and legs spread out.

Don Juan's voice was very clear. It seemed to trigger something in me and I began to write.

“Genaro's love is the world,” he said. “He was just now embracing this enormous earth, but since he's so little, all he can do is swim in it. But the earth knows that Genaro loves it, and it bestows on him its care. That's why Genaro's life is filled to the brim, and his state, wherever he'll be, will be plentiful. Genaro roams on the paths of his love and, wherever he is, he is complete.”

Don Juan squatted in front of us. He caressed the ground gently.

“This is the predilection of two warriors,” he said. “This earth, this world. For a warrior there can be no greater love.”

Don Genaro stood up and squatted next to Don Juan for a moment while both of them peered fixedly at us. Then they sat in unison, cross-legged.

“Only if one loves this earth with unbending passion can one release one's sadness,” Don Juan said. “A warrior is always joyful because his love is always unalterable and his beloved, the earth, embraces him and bestows upon him inconceivable gifts. The sadness belongs only to those who hate the very thing that gives shelter to their being.”

Don Juan again caressed the ground with tenderness.

“This lovely being, which is alive to its last recesses and understands every feeling, soothed me; it cured me of my pains; and finally, when I had fully understood my love for it, it taught me freedom.”

He paused. The silence around us was frightening. The wind hissed softly, then I heard the distant barking of a lone dog.

“Listen to that barking,” Don Juan went on. “That is the way my beloved earth is helping me now to bring this last point to you. That barking is the saddest thing one can hear.”

We were quiet for a moment. The barking of that lone dog was so sad and the stillness around us so intense that I experienced a numbing anguish. It made me think of my own life, my sadness, my not knowing where to go, what to do.

“That dog's barking is the nocturnal voice of man,” Don Juan said. “It comes from that valley towards the south. A man is shouting his sadness, his boredom, through his dog, since they are companion slaves for life. He's begging his death to come and release him from the dull and dreary chains of his life.”

Don Juan's words had caught a most disturbing line in me. I thought he was speaking directly to me.

“That barking, and the loneliness it creates, speaks of the feelings of men,” he went on, “Men for whom an entire life was like one Sunday afternoon, an afternoon which was not entirely miserable, but rather hot and dull and uncomfortable. They sweated and fussed a great deal. They didn't know where to go or what to do. That afternoon left them only with the memory of petty annoyances and tedium, and then suddenly it was over. It was already night.”

He recounted a story I had once told him about a 72-year-old man who complained that his life had been so short that it seemed to him it was only the day before that he had been a boy. The man had said to me, “I remember the pyjamas I used to wear when I was 10 years old. It seems that only one day has passed. Where did the time go?”

“The antidote that kills that poison is here,” Don Juan said, caressing the ground. “The Sorcerer's Explanation cannot at all liberate the spirit. Look at you two. You have gotten the Sorcerer's Explanation, but it doesn't make any difference that you know it. You're more alone than ever, because without an unwavering love for the being that gives you shelter, aloneness is loneliness.”

“Only the love for this splendorous being can give freedom to a warrior's spirit; and freedom is joy, efficiency and abandon in the face of any odds. That is the last lesson. It is always left for the very last moment, for the moment of ultimate solitude, when a man faces his death and his aloneness. Only then does it make sense.”

Don Juan and Don Genaro stood up and stretched their arms and arched their backs, as if sitting had made their bodies stiff. My heart began to pound fast. They made Pablito and me stand up.

“The twilight is the crack between the worlds,” Don Juan said. “It is the door to the unknown.”

He pointed with a sweeping movement of his hand to the mesa where we were standing.

“This is the plateau in front of that door.”

He pointed to the northern edge of the mesa.

“There is the door. Beyond, there is an abyss, and beyond that abyss is the unknown.”

Don Juan and Don Genaro then turned to Pablito and said goodbye to him. Pablito's eyes were dilated and fixed; tears were rolling down his cheeks.

I heard Don Genaro's voice saying goodbye to me, but I did not hear Don Juan's.

Don Juan and Don Genaro moved towards Pablito and whispered briefly in his ears. Then they came to me. But before they had whispered anything I already had that peculiar feeling of being split.

“We will now be like dust on the road,” Don Genaro said. “Perhaps it will get in your eyes again someday.”

Don Juan and Don Genaro stepped back and seemed to merge with the darkness.

Pablito held my forearm and we said goodbye to each other. Then a strange urge, a force, made me run with him to the northern edge of the mesa. I felt his arm holding me as we jumped.

Then I was alone.


Be aware :-

Both Carlos and Pablito became as dust on the road. They met again later with Don Juan and Don Genaro who were also as dust, to continue their apprenticeship.

They had both Stopped the World on that twilight morning. Juan and Genaro, in peering, or “Seeing”, had known they would. Their “induced abandon” and “warrior spirit” had triumphed. Part of the essence of the “warrior spirit” is “Unbending Intent”.

Neither had yet met a Worthy Opponent. They were unaware of the concept.

Neither had met, nor engaged with The Nagual. They were unaware of the concept, although at times Carlos could be induced to engage the “Second Attention”.

The lesson that night would represent far less than 0.1% of their teachings.

Don't dare make presumptions!


Many separate thoughts to consider.

Have fun.

Comments please.

Brevity will be rewarded. Verbosity disqualifies you. 1500 words max. More disqualifies you. Presumptions disqualify you.

(Big Smile)

Tuesday, April 28, 2009

[francis cabrel] il faudra leur dire

This song reminds me of the good times I spent with a dear friend of mine in France.


[charles martel] saint or sinner

A Frankish barbarian of the eastern Frankish kingdom of Austrasia, Charles Martel (688 - 741 A.D.) was most famous for the Battle of Tours (732 A.D.), near Poitiers, in which he defeated the Saracen Moors in their invasion of France, thus preserving Christian Europe from the encroachment of Islam.

He held the title of Mayor of the Palace of Austrasia, but in actuality wielded the power of a king. His byname, "Martel," meant "hammer" and was used to describe the way he indefatigably drove back the Moorish invasion.

Barbarian or the saviour of civilization?

[education destroyed] one of the root causes

Would you please go and read this after you've finished?

Proper education of the individual is available, in our society, to the movers and shakers, that is, to the cognoscenti who know where to find the best for their children and thereby groom the leaders of the future. As for the plebs, they are trapped in a time warp and a philosophical mish-mash of unsupported educational doctrine which has gripped the west for two generations now, its results observable in the state of our children today.

The following article explains why, historically, we are caught in this endless loop in the public sector and why it will not be changing any time soon. Although it discusses the American situation and is dated 1998, it is applicable throughout the west and is as current as when it was written.

Stone, J. E. & Clements, A. (1998), Research and innovation: Let the buyer beware, in Robert R. Spillane & Paul Regnier (Eds.), The superintendent of the future (pp.59-97), Gaithersburg, MD: Aspen Publishers, via J. E. Stone and Andrea Clements, East Tennessee State University

Myron Lieberman (1993) estimates the dollar value of the manpower dedicated to educational research by professors and doctoral students alone to be in excess of $700 million annually. Still other education research is authored by state departments of education, by nonprofit "think tanks," by federal agencies, and by the regional educational research laboratories. Significantly, only a small percentage of published research is undertaken by schools or school systems.

The results of this scholarly activity are readily available to schools through a variety of sources. Thousands of books, professional and academic journals, newsletters, technical bulletins, and other published sources make research available to teachers and administrators, with many recent publications available on the Internet.

A vast amount of material is indexed in the federally sponsored Education Resources Information Center (ERIC). ERIC includes a Current Index to Journals in Education and a microfiche library of mostly unpublished research called Research in Education.

Research in Education is available in education libraries throughout the United States. The amount of research available through these several sources is staggering, and most of it is directly or indirectly related to the problem of improving school achievement.

The idea of improving teaching through the application of science has been around since the earliest days of organized teacher training. John Dewey, for example, believed that the scientific study of child development would improve classroom instruction by suggesting ways in which teaching might be fitted to the learner (Dewey, 1916/1963). However, it was not until the 1960s that governmentally funded research began expanding to present-day levels.

The Johnson administration's "war on poverty" infused federal dollars into university research institutes and education laboratories on an unprecedented scale. Head Start (U.S. Department of Health and Human Services, 1985) and Follow Through (Proper & St. Pierre, 1980) are prime examples. Both were designed to improve the school success of disadvantaged children and they are among the largest educational research projects ever mounted. The Follow Through project alone cost nearly $1 billion.

Has the money and manpower spent on research been justified by improvements in schooling? If the findings reported in Education Week's "Quality Counts" (Wolk, 1997) are any indication, the answer would have to be no. Despite the pressures for improvement created by reports such as the National Commission on Excellence in Education's A Nation at Risk (1983), measured achievement has stayed essentially flat. The National Assessment of Educational Progress scores in math and science have risen only a few points on a 500-point scale since 1973 (U.S. Department of Education, 1996). Of course there are isolated examples of significant improvement, but the broad picture is that the schools are (in the words of "Quality Counts") "treading water."

Why so little impact?

If there is a significant amount of research--although arguably not enough--and the findings are widely available, why is there not at least a trend toward improved achievement?

Again, researchers have an [excuse]: Good research is available but schools fail to implement it. In other words, schools talk as though they adopt research-based innovations but at the classroom level they keep doing the same old thing (Cuban, 1993). There is more than a little truth to this claim. The innovative programs publicized by school administrators are not always translated into classroom practice. Teachers have a great deal of independence in the classroom and they are taught to fit their teaching style to students' needs. Remaining with accustomed approaches is, indeed, the tendency if only for reasons of comfort and familiarity.

Another explanation offered by researchers is that schools don't know good research when they see it. They are easily drawn to familiar practices supported by weak evidence. Unfamiliar practices supported by very credible evidence are often ignored. As discussed below, there is merit to this view.

[D]uring the 1960s and 1970s, correlational studies suggesting self-esteem enhancement as a means to improved achievement led to sweeping changes in teacher training and schooling, but experimental findings to the contrary were ignored (Scheirer & Kraut, 1979).

The [scientific findings] showed that self-esteem and achievement are correlated mainly because achievement enhances self-esteem, not because self-esteem enhances achievement.

One other explanation popular with researchers is that institutional inertia warps and retards progress. Plainly this view also has merit. All organizations encourage some possibilities and restrict others. All are comfortable with certain ways of conducting themselves and uncomfortable with others. Teacher unions, for example, may resist changes that make teachers' jobs more laborious. Administrative customs may resist changes that make jobs look too easy. Of course, community expectations, regulatory policy, and public oversight can all exert resistance to change.

In marked contrast to the views of researchers, schoolhouse "insiders" (i.e., teachers and administrators) say that research has little impact because much of it does not work in the real world. As they see it, schools are doing everything they can to implement the latest findings, but social and economic realities impose limits.

Implementing research is like rebuilding a ship in the midst of a voyage. Staying afloat has to be the first consideration. Rebuilding during a storm is even more problematic. Schools can and do make the changes suggested by research, but circumstances can trump even the best-laid plans. Even with successful implementations, effects are obscured or nullified by factors such as limited resources, two-earner families, increased crime, teen pregnancy, drug abuse, gangs, television, and a host of other hindrances and adversities (Olson, 1997).

Despite the often limited benefit of research-based innovations, schools continue to adopt them--if only to keep up with the latest trends.

Which research and which innovations, however, often depends less on the quality of the findings than on the channel through which the research comes to the school's attention. School personnel are frequently exposed to "the latest" research at workshops, professional meetings, and in-service training. Typically, the teachers, administrators, and board members who attend these meetings have a limited understanding of research and/or of the findings pertaining to the innovation in question. More often than not, presenters and programs for such meetings are selected not because their ideas are well grounded but because they have a stimulating presentation. In addition, audience interest is often spurred by a regulatory mandate or incentive funding, not a burning desire for improved student achievement.

Other pragmatic considerations play a role as well. For example, attractiveness to students, teachers, parents, and other school system stakeholders can weigh heavily in research selections. So can public relations. The desire of school leaders and board members to demonstrate "progressive leadership" often plays a contributory role. In short, the selection of research-based programs and innovations brought back from workshops and meetings may be substantially influenced by considerations other than evidence of effectiveness.

The Restrictions Imposed by Doctrine

The practice of injecting popular psychological theory into schooling--often without regard to effectiveness or applicability--has been a chronic problem in American education (Davis, 1943; Hilgard, 1939). Currently, a poorly recognized but longstanding educational doctrine called "developmentalism" (Hirsch, 1996; Stone, 1996) permeates the public schooling community. Developmentalism frames teaching and learning issues in a way that favors certain types of research and disregards others.

Developmentalism is a derivation of eighteenth-century romantic naturalism. The French philosopher Jean Jacques Rousseau (1712-1778) is the most influential of its early proponents. The works of John Dewey (1859-1952) and Jean Piaget (1896-1980), however, are more directly responsible for its present-day acceptance. Developmentalism is a view of age-related social, emotional, and cognitive change that presumes a child's native tendencies to be a fragile expression of the individual's natural and therefore optimal developmental trajectory (Stone, 1996). It conceives of education as a set of experiences that serves to facilitate and preserve that trajectory by fitting the educational experience to the individual.

Developmentalism contrasts sharply with the classic tradition in education and with the American tradition founded by the Puritans. Both sought to civilize and better the individual, not merely accommodate his or her emerging tendencies. Both classic tradition and the common school aimed to discipline natural impulses in service of a higher good.

The significance of this philosophic issue as an impediment to effective schooling would be difficult to overstate. Most public schools seek achievement to the extent permitted by students' natural inclinations. They are "learner centered."

Most parents and policy makers want schooling that impels achievement beyond that to which most students are inclined by their youthful proclivities (Steinberg, 1996). They are "learning centered."

[Hence, in the UK, the rise and hegemony of the National Curriculum].

The dominance of learner-centered pedagogy is in no small part an accident of history. Progressivism--a social and philosophical offshoot of romantic naturalism--predominated in American intellectual circles in the late nineteenth century and early twentieth century. These were the years during which universal public education came to be public policy as well as the formative years of many teacher-training institutions.

Accepted teaching practices of that day were often harsh and punitive; thus progressive methods were a welcome alternative. The premier teacher-training institution of the early twentieth century was Teachers College, Columbia University (Cremin, 1964). Its graduates led the development of other such programs around the country. Even today, the educational methodologies that prevail in the public education community are those that agree with the philosophic leanings of the Teachers College faculty of the early 1900s (Hirsch, 1996).

Developmentally informed pedagogy has come to dominate public schooling but without clear public recognition of its nature and its role. Over the past 75 years it has emerged and reemerged under a variety of names. In the 1920s it was called "progressive" and "child centered." Today it is termed "reflective" and "learner centered" (Darling-Hammond, Griffin, & Wise, 1992).

However termed, it has consistently maintained that teachers should seek to instruct only through activities that students find engaging and enjoyable. Thus, instead of employing the most enjoyable of teaching methods that are known to result in learning, teachers have been trained first to seek activities that are enjoyable and engaging and to use them in ways that will produce learning. “Good” teaching has come to be thought of as teaching that is well received and that, incidentally, produces some degree of learning.

Uncertainty about learning outcomes was not considered a pedagogic weakness by progressive education's founders. Neither John Dewey nor progressive education's great popularizer, William Heard Kilpatrick, considered conventionally prescribed educational objectives to be the proper aim of schooling.

Instead, both argued that schooling should seek the emergence of an individually defined and broadly conceived intellectual development.

Dewey, in particular, wrote at length about the harm done by teacher insistence on externally defined aims (Dewey, 1916/1963). Viewed from the progressive/learner-centered perspective, research that seeks to demonstrate a teaching methodology's ability to produce a preconceived learning outcome is inherently faulty and inconsistent with the proper aims of schooling.

Despite public repudiation in the 1950s, Dewey's view remains the foundation of today's cutting-edge innovations. It has spawned a remarkable array of educational terms and concepts, and they have been widely propagated by agencies and organizations such as the U.S. Office of Education, the state departments of education, teacher-training programs, accrediting agencies, professional and academic societies, and the like.

The education community seeks to improve schooling through the use of research, but learner-centered strictures guide the adoption process. The impression created by the vast assortment of current educational terms and concepts is one of abundant variety. In truth, however, most conform to the same progressive vision of education.

As noted by E. D. Hirsch (1996), "within the educational community, there is currently no thinkable alternative" (italics in the original, p. 69). Recent permutations and derivatives include the following:

• lifelong learning
• developmentally appropriate instruction
• brain-based learning
• situated learning
• cooperative learning
• multiple intelligences
• multiaged instruction
• discovery learning
• portfolio assessment
• constructivism
• hands-on learning
• project method
• thematic learning
• integrated curriculum
• higher-order learning
• authentic assessment
• whole-language reading

How Learner-Centered Thinking Restricts Choices: The Case of the Follow Through Project

Learner-centered doctrine discourages the use of results-oriented research (Stone, 1996). Studies concerned with improving achievement typically test an intervention or treatment (i.e., an action taken by the researcher that is intended to produce change in the student). The success of the intervention is judged in reference to some predetermined expectation.

In contrast to the goal of inducing results, the goal of developmentally informed research is to accommodate schooling to the individual and to do so in a way that achieves the ends to which the individual is inclined by nature, not those prescribed by the curriculum.

One of the clearest instances of results-oriented research rejected on learner-centered grounds comes from the Follow Through project (Proper & St. Pierre, 1980). Follow Through was a huge federally funded research project of the late 1960s and early 1970s. It was launched in 1967 by the Ninetieth Congress in response to President Johnson's request to "follow through" on project Head Start. Improved achievement in the basic skills of disadvantaged students was its prime objective. It remains the largest educational experiment ever.

Nine educational models were compared in 51 school districts over a six-year period. Of the nine, all but two were learner centered; and contrary to the prevailing educational wisdom, the two exceptions significantly outperformed the field.

Of greater significance, five of the seven learner-centered models produced worse results than the traditional school programs (i.e., the nontreated control groups) to which each Follow Through approach was compared.

What makes the contrast especially striking is that the outcome measures included not only basic skills but "higher-order" cognitive skills and a measure of self esteem--the very sort of outcomes that learner-centered methods are intended to enhance.

The most successful of the nine models was Direct Instruction (Engelmann, Becker, Carnine, & Gersten, 1988), a structured and so-called teacher-centered approach. Despite its overwhelming success, Direct Instruction was disparaged and largely ignored by the education community (Watkins, 1988).

A lengthy critique of Follow Through was published in Harvard Educational Review (House, Glass, McLean, & Walker, 1978), and the U.S. Department of Education's National Diffusion Network--a bureaucratic agency responsible for disseminating only the "best" research--concluded that all nine programs were valid and all were recommended for further funding.

In fact, added funding was given to the failed models on the grounds that they needed strengthening.

The Follow Through Direct Instruction findings are by no means the only research that has been ignored because it disagreed with the learner-centered view. Herbert Walberg (1990, 1992) summarized some 8,000 reports of demonstrably effective teaching methods. Like Direct Instruction, most were structured, teacher-directed, and designed to produce measurable gains in achievement.

Most could be described as learning-centered instead of learner-centered. Many employed drill, recitation, and incentives for student effort. A review of research literature by Ellson (1986) found 75 studies of teaching methods that produced achievement gains at least twice as great as those of comparison groups. Many of them were popular at one time but none are learner-centered and none are in widespread use today.

The reception accorded Direct Instruction and other learning-centered research is important because it highlights a critical difference between the public's educational objectives and those of the learner-centered schooling establishment. Public Agenda (Johnson & Immerwahr, 1994) and other public polling organizations have found that the public wants schools that produce conventionally measured academic achievement.

The public is not opposed to the goals of learner-centered schooling, but it considers them secondary to conventional academic achievement. To the public, outcomes such as improved self-esteem are attractive, but schools that fail with respect to academic achievement are unacceptable no matter what else they may produce. The same priorities are embodied in state-level school accountability policies.

They focus primarily on academic gains operationally defined by achievement tests. By contrast, learner-centered research gives equal priority to "intellectual growth," enhanced self-esteem, and gains in knowledge and skills. If one or more of the three are produced, the research is taken to be informative and potentially valuable for school implementation.

Why Researchers Remain Learner-Centered

Despite the ever-growing demand for improved achievement, neither researchers nor schools are able to break away from learner-centered thinking, and for several reasons. Both researchers and most school personnel are indoctrinated in learner-centered thinking, and powerful incentives encourage them to remain loyal to that point of view.

For researchers, funding is a prime incentive. Fund allocations are almost inevitably influenced by other educators, and most of them subscribe to learner-centered orthodoxy. Funding affords a researcher time to work, and to have a reasonable chance at funding, one's proposal must appeal to the views of other educators.

For most researchers, funding is tied to institutional support. Most researchers are college faculty, and their primary responsibility is teaching. If a faculty member needs time to conduct a study, the institution must at a minimum relieve the individual from teaching.

Ordinarily it will hire someone to teach in his or her place. Research grants provide the funding for the substitute instructor. If the researcher's employer does not like a proposal, it may decide against released time. A proposal that appeals to the views of learner-centered administrators and colleagues is more likely to find support.

Grants also pay what are called "indirect costs" for the use of the institution's facilities and other forms of overhead. These are additional funds that may amount to 50% or more of a research project's direct costs for a substitute instructor, equipment, supplies, and so forth. The funds an institution receives for such costs are typically added to various administrative budgets, thus enabling substantial discretionary spending.

College administrators consider a faculty which generates big indirect cost contributions to be their most productive and deserving faculty. Grants are the key to a faculty member's career advancement at major institutions. Grants that are readily funded for big amounts (e.g., grants from state education agencies) are thus extremely attractive.

Second, there is the matter of publication. In order to advance their academic reputations, researchers must publish. Research that is not published is assumed to be of lesser quality, and rightly so. Research that is published in the most respected journals is stringently peer reviewed. Reviewers and editors do not rule out findings that are inconsistent with orthodoxy, but such reports inevitably receive much closer scrutiny and are thus less likely to be accepted. A record of successful publication also contributes mightily to a researcher's chance of acquiring more funding.

Third, there is the matter of acceptance in the schools. The learner-centered view is more attractive to researchers because it is more easily marketed to the schools. Public school administrators typically have been trained in learner-centered thinking, thus such research has an intuitive appeal.

That it may not produce intended results is a downside, but one that is frequently overlooked. School administrators are never fired or penalized because an innovative program fails. After all, how could an administrator be blamed for accepting the recommendations of scholar-experts who are supported by prestigious institutions?

Because success is defined more in terms of funding than outcomes, appeal to decision makers is more important than demonstrated effectiveness. One need only observe the indicators of organizational advancement that are trumpeted in the media to verify the truth of this conclusion. Media releases talk about money and organizational expansion, not increased student learning.

The learner-centered view is comfortable to other stakeholders as well. Its convenience and vague expectations are significant considerations to teachers. In the learner-centered view, teachers are responsible for affording a quality educational experience, not the production of measurable academic outcomes.

Learner-centered teachers consider outcomes to be governed by factors outside teacher control, thus the quality of teaching cannot be judged by results. Also, teachers find that learner-centered approaches are flexible and can be blended with existing practice without inconvenience and disruption. Factors of this sort make the task of adopting learner-centered practices simpler than, for example, implementing Direct Instruction--a methodology requiring more than the usual day or two of in-service training.

Learner-centered instruction also appeals to students. It seeks to accommodate them, not to shape them. By contrast, schooling that produces results typically requires a concerted student effort, and the time devoted to such an effort can infringe on more attractive pursuits (Steinberg, 1996).

It should be noted, however, that students' short-term satisfactions come at very substantial longer-term cost.

Lost educational opportunity may result in permanently impaired career prospects--a delayed cost that students are unable to anticipate. Lost opportunities also cost taxpayers both in failed human resource development and the cost of remediation. Schooling that permits students to waste their own time and taxpayer-funded educational opportunity is an enormous but largely overlooked public disservice.

Recognizing Useful Research

Research that can add to the efficiency and effectiveness of public schooling is available, but school personnel must be able to recognize it. Otherwise, there is a very substantial chance that they will be drawn into adopting one of the many fads that dominate the educational landscape. Recognizing credible, useful studies requires an understanding of certain basics of research.

Both medicine and education rely on a scientific knowledge base. Medicine, however, relies on relatively mature and exact sciences such as physics, chemistry, and biology, whereas education relies on the far less mature social and behavioral sciences.

These differences in quality of research and precision of measurement are reflected in the certainty and internal coherence of the knowledge base on which the two professions rely. Competing and contradictory findings are not uncommon in the behavioral sciences; thus the matter of determining which findings are credible, important, and applicable is a formidable challenge to the educational practitioner.

[JH - Many areas of science, e.g. climate change in recent years, have revealed the cracks in the omnipotence of science, so it is not only the behavioural sciences in question any more.]

Given facts open to selective use and interpretation, educators frequently rely on knowledge that is equivocal or that may be contradicted by other evidence. Recognizing this condition, Anderson, Reder, and Simon (1995) offer the following caution:

[N]ew "theories" of education are introduced into schools every day (without labeling them as experiments) on the basis of their philosophical or common sense plausibility but without genuine empirical support.

[Instead] we should make a larger place for responsible experimentation that draws on the available knowledge. It deserves at least as large a place as we now provide for faddish, unsystematic and unassessed informal "experiments" or educational "reforms."

We would advocate the creation of an "FEA," an analogy to the FDA, which would require well-designed clinical trials for every educational "drug" that is introduced into the marketplace. (p. 24)

Another limit on sound educational research is the inherent variability in human behavior. People think, feel, act, cooperate or don't cooperate, and so forth. Unlike inanimate objects, their actions are influenced by a range of extraneous variables that limit the applicability of findings.

Behavioral sciences such as psychology have evolved standards that enable meaningful research despite these uncertainties. Unfortunately, many studies ignore them and consumers frequently fail to recognize the inevitable deficiencies and limitations. Thus it is not uncommon for educational administrators, grant writers, and program developers to stretch findings beyond their intended meaning or inadvertently to misrepresent results.

Part 2, to be posted later in the week, analyses research itself. This post was more concerned with the political aspects of the acceptance of research and of which type.

As an adjunct to this article, it is useful to consider these:

Catherine Barrett, former president of the National Education Association, wrote, on Feb. 10, 1973, that "dramatic changes in the way we will raise our children in the year 2000 are indicated, particularly in terms of schooling. We will need to recognize that the so-called 'basic skills,' which currently represent nearly the total effort in elementary schools, will be taught in one-quarter of the present school day. When this happens - and it's near - the teacher can rise to his true calling. More than a dispenser of information, the teacher will be a conveyor of values, a philosopher. We will be agents of change."

[JH – step forward, Common Purpose and the Mentoring programme.]

In the March/April, 1976, issue of "The Humanist," Paul Blanshard wrote: "I think the most important factor leading us to a secular society has been the educational factor. Our schools may not teach Johnny to read properly, but the fact that Johnny is in school until he is sixteen tends to lead toward the elimination of religious superstition. The average child now acquires a high school education, and this militates against Adam and Eve and all other myths of alleged history."

The failure of today’s late Boomers through to Gen Y to grasp the necessity for a moral framework behind education is largely a result of this sort of thing, which has been peddled for three generations now and is taken as read by most, despite having no demonstrably sound foundation.

Here’s another example of the quite demonstrable corrosion of values in education, of which the research conundrum leading this post is but one manifestation:

Raymond English, Vice President of the Ethics and Public Policy Center, told the National Advisory Council on Educational Research and Improvement, on April 2, 1987, that "critical thinking means not only learning how to think for oneself, but it also means learning how to subvert the traditional values in your society. You're not thinking 'critically' if you're accepting the values that mommy and daddy taught you. That's not 'critical.' "

Critical thinking or analysis, a fine concept in itself, is here hijacked into rejecting even sound values imparted by parents and asks a child to differentiate, from the experience of ages, what to throw out and what to retain. Hence the wedge between teacher and parent, with the parent largely unaware of what values the teacher is imparting on a day to day basis. And guess which values they will be? Conforming to which all-pervasive world view?

During my teacher training, we were expected to acknowledge A. S. Neill's Summerhill as the summum bonum, to accept the flawed Piaget as gospel, to study the wisdom of Erich Fromm and to adopt ‘open plan’ as the most effective route to learning. Spock was later shown to have been wrong and actually apologized for it, but by then the damage had already been done and make no mistake – teachers spouting this stuff still occupy major places in schools and colleges of further education, lurking around children’s minds, spreading this guff.

The sort of teacher we're talking about, the left-liberal or out-and-out socialist, was trained on a diet of this seemingly humanitarian but dangerously wrong philosophy and can point to any amount of narcissistic "research" to support it [see article above]. It's hardly likely, then, that after thirty years or so in education, he or she will ever admit to having been taken for a ride. It would be to his or her credit to do so and to start to undo the damage before retiring.

What should be put in its place? Well, let’s start with the liberal arts [not liberal in the modern sense of the word] and go from there. But because the liberal arts produce a more rounded and educated person, they are held in reserve for the grooming of the elite; the chance of most of the populace enjoying this advantage is next to zero, particularly given the cynical socialistic stranglehold on public education, on admissions policies, on teacher training and selection. It’s a fait accompli and can be seen in the illiteracy and innumeracy of our children.

Now that it is too late, and it will take two generations to reverse the damage, people might just be starting to sheet home the blame where it belongs, whilst the movers and shakers continue to provide their own children with the best available.

This is what’s happening in the wider world of politics, educational policy being just one plank of the institutionalized socialist platform to destroy the fabric of society and keep the plebs under control.

Please read Tom Paine's take on this.

[merthyr tydfil] does it take the prize

I wonder how true this is:
I've just looked at a number of lists and the most consistently bad was Merthyr Tydfil. I can't see this, actually. The place seems to be a hotbed of discovery:

While testing a new angina treatment, researchers in Merthyr Tydfil discovered (purely by accident) that the new drug had erection-stimulating side effects. This discovery would go on to form the basis for Viagra.

Monday, April 27, 2009

[alaric] and the joy of visigothdom

Who can honestly say that at some time or other, he hasn't wanted to be a Visigoth? What a career move. Which one would you be - Alaric?

Alaric was the first barbarian to successfully capture the city of Rome, in 410 A.D. Although his troops spared most of the residents and the architecture (Alaric was a known lover of beauty and literature), they pretty well looted the place.

Interestingly enough, a vision of his some 15 years before had predicted that he would successfully capture Rome. After the capture, he traveled south with the intention of crossing over into Africa, but was hindered by storms along the Mediterranean coast.

His descendants, the Visigoths, migrated to the Iberian peninsula and eventually became the Spaniards; an indication of their heritage lies in the fair hair and blue eyes of the northern Spaniards. See also Stilicho below.

Now those guys really knew how to kick butt.

[alien abduction] and john e mack

BBC Magazine said this of Professor John E Mack:

[He] was an eminent Harvard psychiatrist, psychoanalyst and Pulitzer Prize winner whose clinical work focused on dreams, nightmares and adolescent suicide. In 1990, he turned the academic community upside down because he wanted to publish research in which he said that people who claimed they had been abducted by aliens were not crazy at all. Their experiences were genuine.

Actually, what he said was: 'I have no way to account for them.'

Mack initially suspected that such persons were suffering from mental illness, but when no obvious pathologies were present in the persons he interviewed, Mack's interest was piqued.

As BBC mag quoted Mack:

"What are the other possibilities?" asked Mack. "I would never say, yes, there are aliens taking people. [But] I would say there is a compelling powerful phenomenon here that I can't account for in any other way. It seems to me that it invites a deeper, further inquiry."

One of the supposed abductees, Peter Faust said:

"Do I question my own sanity? Absolutely, every day, because the world says you're crazy for having these experiences. But if it was only me who had had intimate experience with female aliens, producing hybrid offspring, I would say I'm certifiable, put me away, I'm crazy. And that's how I felt when I initially had these experiences.

My wife thought I'd lost it. But then I began to look at the experience outside myself and realised that hundreds, if not thousands, of people reported that exact same experience. And that gave me sanity. That gave me hope. I knew I couldn't be fantasising about this."

Three things followed Mack's investigation of 200 claimed abductees:

1. He was sent a letter informing him that there was to be an inquiry into his research. It was the first time in Harvard's history that a tenured professor was subjected to such an investigation.
John Mack decided to fight back and hired a lawyer, Eric MacLeish. There followed 14 months of stressful and bitter negotiations.

Eventually Harvard dropped the case and a statement was issued reaffirming Mack's academic freedom to study what he wished and concluding that he "remains a member in good standing of the Harvard Faculty of Medicine".

2. He was killed by a car in 2004, in north London, shortly after leaving a Tube station. At the time he was visiting the city to deliver a lecture connected with his 1977 Pulitzer Prize-winning research.

3. [He had] the support of Laurance Rockefeller, who also funded Mack's Center for four consecutive years at $250,000 per year.

Whatever one thinks about the alien issue, the problem Mack poses is that he was one of the most highly regarded tenured professors, that his style was empirical and straightforward, and that his conclusions were mildly stated.

His character can also be gleaned from one of his statements:

The extension of a new world view that derives from our experience of the interdependence and interconnectedness of all living things, together with a recognition of the fragility of the earth's ecosystems, will be an important step in the preservation of the planet.

But blowing the traditional Western mind is not enough. Leadership and action on behalf of life and the environment will be required. We will need to take risks and expose our vulnerabilities. Perhaps it has always been so, but I am struck by how many of the political and intellectual leaders I admire for their efforts on behalf of human life have spent time in prison.

Facing up to the established order, taking a stand with one's whole being, exposing one's vulnerability, and risking the loss of personal freedom all seem to inspire both leaders and their followers.

With a world view like that, no wonder his conclusions have sat uncomfortably with certain sections of society.

Friday, April 24, 2009

[barbarians] prepare thyselves, my friends

Herwig Wolfram records what the ancients wrote of the barbarians:

"Barbarians can neither think nor act rationally, theological controversies are Greek to them...Under the assault of their horrible songs the classic meter of the ancient poet goes to pieces...Barbarians are driven by evil spirits; "possessed by demons", who force them to commit the most terrible acts...incapable of living according to written laws and only reluctantly tolerating kings...

Their lust for gold is immense, their love of drink boundless. Barbarians are without restraint...Although generally they are considered good-looking, they are given to gross personal hygiene...They run dirty and barefoot, even in the winter...They grease their blond hair with butter and care not that it smells rancid...

Their reproductive energy is inexhaustible; the Northern climate of their native land, with its long winter nights favors their fantastic urge to procreate...If a barbarian people is driven back or destroyed, another already emerges from the marshes and forests of Germany...Indeed, there are no new barbarian peoples--descendents of the same tribes keep appearing."

(Herwig Wolfram, The History of the Goths, pp. 6-7).

Thursday, April 23, 2009

[hr] and the mania for box ticking

There are certain occupations which are intrinsically unpopular.

Below was a typical summary of the prevailing attitude to HR around mid-2007, so I thought it might be time to revisit and see if it has improved any today.

1. They think you are a resource

Petroleum, water, lumber, and humans. One of these things is not like the others—unless, that is, you consult the HR department. They view you as a resource, and they are not shy about it.

2. Talking to them accomplishes nothing

While your meeting with the HR rep may be the closest you get to being heard, the fact of the matter is, he or she probably can’t change the landscape very much, if at all. The people who could do something about it, the ultimate decision-makers, do not want to be bothered by a sea of personal stories.

3. No real understanding of you or your job

With a professed lack of interest in the details of your job or your life, and a complete inability to do anything about either anyway, it’s not really surprising that the HR department makes little to no effort to understand what’s going on in the trenches.

4. Inflexible policies and red tape

The policies of the HR department are designed to cover a ridiculously broad range of circumstances with one fell swoop. Making blanket statements about how much of a raise you can give someone, how quickly you can promote someone, and how to move an employee from one role to another laterally is just another step toward oversimplification and homogenization of human dynamics down into human resources.

5. They pretend to be on your team

“Our people are our key asset, everything we do is informed by our constant vision of teamwork and shared opportunity…” Well, it doesn’t take long to realize how far that is from the truth in most cases.

Jacquelyn Thorp Kinworthy, a professor at Cal State-Fullerton and CEO/founder of HR-Coach Products and Services, found in 2006 that:

1) Companies hire inexperienced and unqualified people to handle HR, but expect them to perform at higher levels than they are qualified for.

2) Companies do not invest in HR as they do in other departments.

3) Many small- to medium-size companies have HR people who are strategic partners.

Comments included:

When looking for a top-notch program that would prepare me to be an HR leader, I found that there weren't many programs that were forward-looking. [Bob Filipczak]

Many HR people I know (and I am an HR person so I know a lot of them) have a very narrow perspective - they know HR but they do not know business. I believe HR people are better off with a business degree than an HR degree.

If HR can demonstrate and take ownership for the aggregate human capital investment of the business and show how the productivity and ROI of the investment can be improved...they'll have a lot of influence in the management of the company and be "at the table." [mahendrakumardash]

The majority of people in HR are so wrapped up in politics and diversity programs they have no interest or time for activities that add value to the company's bottom line. They are "policemen" and view employees with contempt … I follow the works of Jeff Pfeffer, Dr. John Sullivan and David Ulrich, but I see no evidence of their theories in practice in Canadian business. [Frank DiBernardino]

That was then, so has it improved? In the UK, with the Rise of the NVQ, which requires you to have a specialist qualification even to clean the floor or sweep the street, certain jobseekers I know have complained that HR is a closed club of box tickers.

This is understandable, as every guild in history has tried to guard its esoteric language and list of prerequisites for admission; in one of my own fields of work, education, Special Needs teachers are the most open manifestation of that little game. I don’t know about HR and can only go on what I read.

What this post needs is some input from the HR professionals themselves, putting us straight about the current state of play. Most people I know in the corporate world continue to undervalue this department, even seeing it as obstructionist and irrelevant, so it would be nice to read the other side of the story.