
Saturday, December 18, 2010

Sentiment analysis

Have you ever thought an iPod might play "As Long As You Love Me" when you are with your love, and "Summer of '69" when you are with friends, without you asking for it?
One of the recent developments in this direction has been coming through social and information media, with media-player giant Winamp working on it.
Here are the excerpts:
One of the developments we’re currently tracking is the manifestation of more tools for understanding mood and sentiment analysis. A number of services have been popping up around this idea – Littlecosm and Tweetfeel just to name a couple – of which the most notable are trying to passively gather information about mood, aggregate this sentiment in some way, and potentially provide a layer of analysis that could result in an interesting recommendation engine. We were notified by the Winamp team that they have begun to incorporate a similar type of recommendation system for their music platform now powered by Syntonetic’s Moodagent, so we took the opportunity to speak with Syntonetic CEO Peter Berg Steffensen about these ideas. Below Peter shares his insight on the thinking behind the development of Moodagent, and some brief thoughts on where sentiment analysis may be heading.

How do you see sentiment analysis + social recommendation changing over the next 3-5 years?

As the availability of music on device, in home entertainment and in-cloud nears completion, personalization will be key to entertainment fulfilment. Capturing the personal sentiment play-by-play will allow Moodagent to build engaging experiences by bridging in relevant media assets in novel ways (how about an Emotional Weather Forecast, as Tom Waits suggests?). This applies equally to music videos and other digital assets with audio as a natural component. Expanding this to include group sentiments in social networks is a natural extension that can serve to "guide" or inspire, especially in the physical space ... I'd expect the hardware and sensor manufacturers to match this with e.g. NFC heart/mood-rate detection.

What new ideas are emerging around sentiment tracking?

Together with the University of Glasgow and mobile hardware manufacturers, Moodagent is currently working on a couple of prototypes for sentiment tracking and control for both individuals and groups, but you'll have to wait for the details until we get them out of the lab.

We currently have systems that center around things like pushing a ‘like’ button, rating systems, or tags to catalog mood. How can these things be more passive experiences?

Moodagent can map the individual history of use, and with that group use-history, we can put the experience on remote control for one and all (applying sensors where available, of course) ... but we'd like to think that we can build such engaging products that our users will want to play along.

Thanks, Peter!

Winamp
Moodagent
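
For the curious, here is a minimal sketch of how such a mood-based recommender might work: rank tracks by the similarity between their mood profiles and the listener's current mood vector. The five mood dimensions and every number below are invented for illustration; Syntonetic's actual features and scoring are not public.

```python
# A minimal sketch of mood-based recommendation, in the spirit of what
# Moodagent describes. All mood profiles here are hypothetical.
import math

# Hypothetical profiles: (angry, sad, happy, tender, sensual)
TRACKS = {
    "As Long As You Love Me": (0.05, 0.10, 0.60, 0.80, 0.40),
    "Summer of '69":          (0.20, 0.05, 0.90, 0.10, 0.15),
    "Hurt":                   (0.30, 0.95, 0.05, 0.40, 0.10),
}

def cosine(a, b):
    """Cosine similarity between two mood vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def recommend(current_mood, tracks=TRACKS):
    """Rank tracks by how closely their profile matches the listener's
    current (sensed or inferred) mood."""
    return sorted(tracks, key=lambda t: cosine(tracks[t], current_mood),
                  reverse=True)

# A romantic evening vs. a night out with friends:
print(recommend((0.0, 0.1, 0.5, 0.9, 0.5)))   # tender track ranks first
print(recommend((0.1, 0.0, 0.95, 0.1, 0.1)))  # upbeat track ranks first
```

Group sentiment, as Peter describes it, would then just be an aggregate (say, an average) of the individual mood vectors in the room.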

neuroscience of prediction


When the eight-legged Paul the octopus or the tarot parrot gave predictions about the winners of World Cup matches, many were astounded, many were shocked, and many simply ignored the predictions.
Man, meanwhile, has designed ways to realize through thought the mental images he holds in mind. This is the neuroscience of prediction. The pragmatist may or may not agree, an atheist will be alarmed, and a theist might trust the science.
But herein comes the rational optimist, who believes that what we see goes beyond our regular senses but can be grasped through computation and inductive logic.
Here is one article I came across:
http://online.wsj.com/article/SB10001424052748704156304576003432223218682.html
I recently came across the phrase "remembering the future." Rather than some empty poetic paradox, it appeared in an article about a neuroscientific experiment that tested a hypothesis of Karl Friston of University College, London, that the brain is more active when it is surprised.

In the study, volunteers watched patterns of moving dots while having their brains scanned. Occasionally, a dot would appear out of step. Although there was the same number of dots, the visual part of the subjects' brains was more active when the dots broke step. According to Arjen Alink of the Max Planck Institute in Frankfurt, Germany, who did the experiment, the brains were predicting what would happen next and having to work harder when their predictions failed. They were "remembering the future."

There is a growing conviction within neuroscience that one of the human mind's chief preoccupations is prediction. Jeff Hawkins, the founder of Palm Computing who is now a full-time neuroscientist, argued in his 2004 book "On Intelligence" that the mind does this by detecting a familiar pattern in its input, then anticipating from past experience what usually follows. The more unexpected something is, the more conscious we are of it.

This explains a lot about awareness. When I push my foot down on the brake pedal, I expect to feel deceleration. If I do, I am barely conscious of the fact: My mind continues to concentrate on the radio or my conversation with my passenger. If I don't, I am immediately so aware of the car skidding on the ice or the brakes failing that my mind is fully occupied with the failed prediction.

The big brains of human beings undoubtedly lead them to predict patterns further ahead than other animals. My dog is quite capable of expecting to be taken for a walk or given her dinner at certain times of the day. But she is not capable, as I am, of expecting cold weather in winter or predicting the need to pack a suitcase before a trip. Still, she probably has a longer view of the future than a guinea pig, which in turn sees further ahead than a frog.

Some birds stand out as exceptionally good at "mental time travel." The psychologist Nicky Clayton observed that western scrub jays steal food left behind by lunching students at the University of California at Davis. The jays hid the food by digging it into the ground. Sometimes they came back later and moved the food—but only if they had been observed by other jays when hiding the food in the first place. Dr. Clayton has since shown in her lab at Cambridge University that they do this to foil thieves, and that scrub jays are uniquely forward-thinking in this respect, even compared with other food-caching species of bird.

Dr. Clayton's other experiments with children reveal that this mental time travel becomes possible for human beings around the age of five. As adults, we inhabit longer futures than children, and longer pasts, too.

Daniel Schacter of Harvard University has made the remarkable discovery that the same parts of the mind hold both our episodic memories and our imagined futures. That is to say, if asked to imagine some specific future event, people activate the very same regions of the brain as they do when asked to recall some particular past event. Indeed, people who suffer strokes that affect these regions lose not just the ability to remember their own lives but the ability to imagine future possibilities as well.

Dr. Schacter concludes, much like Dr. Hawkins and Dr. Friston, that "a crucial function of the brain is to use stored information to imagine, simulate and predict possible future events." Through technology like writing and printing, the longer we extend the past, the longer our view of the future becomes. But that is a subject for another column.
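
To make the prediction idea from the dots experiment concrete, here is a toy sketch: a predictor extrapolates each position from the last step and registers "surprise" (prediction error) only when a dot breaks step. This is purely illustrative, with made-up positions; it is not Alink's actual paradigm.

```python
# Toy predictive-coding sketch: surprise = size of the prediction error.
def surprise_trace(positions):
    """Predict each position as last + last step; surprise = |error|."""
    errors = []
    for i in range(2, len(positions)):
        predicted = positions[i - 1] + (positions[i - 1] - positions[i - 2])
        errors.append(abs(positions[i] - predicted))
    return errors

regular = [0, 1, 2, 3, 4, 5, 6]   # dot moves in step
broken  = [0, 1, 2, 3, 7, 8, 9]   # one dot appears out of step

print(surprise_trace(regular))  # all zeros: nothing to notice
print(surprise_trace(broken))   # error spikes where the pattern breaks
```

The spike in the second trace is the analogue of the extra activity in visual cortex: the "brain" works harder exactly where its prediction fails.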

Monday, December 13, 2010

neurobics: keeping the brain alive


How do we exercise our brain? It is a tough ask if we do not first answer the question of how the brain works.

The brain receives, organizes, and distributes information to guide our actions and also stores important information for future use. The problems we associate with getting older—forgetfulness, not feeling "sharp," or having difficulty learning new things—involve the cerebral cortex and the hippocampus.

The cortex is the part of the brain that is responsible for our unique human abilities of memory, language, and abstract thought. The hippocampus coordinates incoming sensory information from the cortex and organizes it into memories. The wiring of the cortex and hippocampus is designed to form links (or associations) between different sensory representations of the same object, event, or behavior. Every cortical region sends and receives millions of impulses via these axons to and from dozens of other cortical regions. The brain contains literally hundreds of miles of such wires. Thus, the cortex resembles an intricate web, with each region linked directly or indirectly to many other regions. Some of these connections are between areas that process similar information, such as the thirty involving vision, while other connections are between dissimilar areas, such as touch and smell. The network of pathways between cortical regions that do many different things is what allows the cortex to be so adept at forming associations.

Like the cortex, the hippocampus plays an important role in forming associations. The senses continually flood the brain with information, some of it vital but much of it unimportant. You don't need to remember the face of everyone you pass on the street, but you do want to recognize someone you just met at your boss's party! To prevent the information overload that would accompany having to remember too much, the hippocampus sifts through the barrage of incoming information from the cortex and picks out what to store or discard. In other words, the hippocampus acts like a central clearinghouse, deciding what will be placed into long-term memory and then, when called upon, retrieving it. The hippocampus's decision to store a memory is believed to hinge on two factors: whether the information has emotional significance, or whether it relates to something we already know.

Because each memory is represented in many different cortical areas, the stronger and richer the network of associations or representations you have built into your brain, the more your brain is protected from the loss of any one representation. These multisensory representations for tasks like remembering names were always available to you, but early on, your brain established an effective routine for meeting people that relied primarily on visual cues. An important part of the Neurobic strategy is to help you "see" in other ways—to use other senses to increase the number and range of associations you make. The larger your "safety net," the better your chances of solving a problem or meeting a challenge, because you simply have more pathways available to reach a conclusion.
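
A toy sketch of the two ideas above, with made-up rules and data: a hippocampus-style gate that stores an item only if it is emotional or touches prior knowledge, and a memory whose many sensory associations let it survive the loss of any one cue.

```python
# Toy model of the storage rule and the associative "safety net".
# The rule and all the example data are illustrative, not a neural model.
known = {"boss", "party"}   # things we already know about

def should_store(item, emotional, associations):
    """Gate a memory: keep it if it carries emotion or links to prior knowledge."""
    return emotional or bool(associations & known)

print(should_store("stranger on the street", emotional=False, associations=set()))  # False
print(should_store("person met at boss's party", False, {"boss", "party"}))         # True

def recallable(memory_cues, lost_cue):
    """A memory stays reachable as long as at least one cue-pathway survives."""
    return bool(memory_cues - {lost_cue})

name = {"face", "voice", "handshake", "perfume"}   # multisensory representation
print(recallable(name, "face"))  # True: other senses still reach the memory
```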
The early practice of neurobics emerged in the gurukuls of Vedic India.
The process was to ally man's senses with nature and give him the ability to learn from the infinite.
The chakravyuha and vipassana meditation have both been explained in this way, as both represent the matrix of the mind: interconnection, association, and the rest.
So neurobics has its root node in the earliest Vedic scriptures.

Friday, December 10, 2010

Facebook: social revolution


The way social networking has scaled peaks and crossed barriers, it is hardly necessary to say that Facebook is omnipresent, second only to a being named God.
Nobody in the virtual world can deny the presence of these two entities, Google and Facebook, and the world seems unipolar in meeting their needs.
Facebook seems to be a bit of an omnipresent entity these days. Once the domain of college kids, it now features users of all ages and has also turned into a promotional tool for artists, politicians and businesses.
Vincenzo Cosenza has been tracking various social networks’ popularity throughout the world, and he just posted a December 2010 map — based on data by Alexa and Google Trends for Websites — showing just how prominent Facebook has become throughout the world.
A comparison between this new map and one from June 2009 reveals that Facebook — consistently the top social network in countries like the U.S., Canada and Australia — is growing even more popular in the international sphere. Based on Cosenza’s findings, it looks like Facebook has replaced Orkut, once in the top spot, as the number one social network in India. And the site has also gained ground in South America and Europe.

There are some instances where other social networks have maintained their popularity; Russia is still dominated by VKontakte, and Orkut maintains a strong foothold in Brazil. Meanwhile, QZone’s big in China.
According to Cosenza’s findings, other social networks, while not in the top spot, are also starting to gain some traction in other countries. LinkedIn is in third place in Australia and Canada, and Twitter holds the same ranking in Germany and Italy.



World Map of Social Networking, Dec. 2010

So it is high time we feel the breeze of social media's future and understand what is to come.
I have tried to sketch what social media will give us in the next decade.
Media will be more responsible, but users may be errant in their demands of the web.
The cloud database may prove a boon for the informer but will be a tough ask for the media observer.

1. Privacy expectations will (have to) change
There will be a cultural shift, whereby people will begin to find it increasingly more acceptable to expose more and more of their personal details on different forms of social media. Sharing your likes, dislikes, opinions, photos, videos and other forms of personal information will be the norm, and people will become more accepting of personalized experiences, both corporate and personal, that react to this wealth of personal information. There are a number of fake profiles on Facebook claiming to be celebrities. These will be on the rise, and fake paparazzi may also be a modern entrant, so the real person must be highly accepting and have the courage to let it go when the need arises.

2. Complete decentralization of social networks
The concept of a friend network will be a portable experience. You’ll find most digital experiences will be able to leverage the power of your social networks in a way that leverages your readily available personal information and the relationships you’ve established. We’re already seeing the beginnings of this with Facebook Connect and Google’s FriendConnect.

3. Our interaction with search engines will be different
Real-time information in Google search, e.g. from Twitter, blog results and user reviews, will be more prominent. Google’s Social Search will change the way we interact with search engines by pushing relevant content from our personal networks to the front of search results, making them more personalized. The importance of digital-influencer marketing will increase significantly.
Brand advertising will give way to brand equity and proper leveraging.
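
As a rough sketch of the re-ranking idea, consider simply boosting results authored by people in your network. The fields and weights below are invented; this is not how Google's Social Search actually scores results.

```python
# Hypothetical social re-ranking: friends' content floats to the top.
friends = {"alice", "bob"}

results = [
    {"url": "example.com/review", "base_score": 0.70, "author": "stranger"},
    {"url": "example.com/blog",   "base_score": 0.55, "author": "alice"},
]

def social_rank(results, friends, boost=0.3):
    """Add a flat bonus to results from the user's own network."""
    return sorted(results,
                  key=lambda r: r["base_score"]
                  + (boost if r["author"] in friends else 0),
                  reverse=True)

for r in social_rank(results, friends):
    print(r["url"])  # alice's result now outranks the higher base score
```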

4. Rise of the content aggregators
The amount of content online is growing at an exponential rate, and most online users have at least three online profiles, from social networks to micro-blogging to social news sites. Managing this influx is challenging, and content aggregators will be the new demi-gods, bringing method to madness (and making a killing). Filtering and managing content will be big business for those who can get it right and provide easy-to-use services.

5. Social media augmented reality
Openly accessible information from the social-media space will be used to enhance everyday experiences. For example: the contacts book in your phone links to Facebook and Twitter to show real-time updates on what the contact is doing before you put in the call, real-time reviews from friends and associates will appear in GPS-based mapping services as a standard feature, and socially enabled CRM will change the way companies manage business relationships forever. Information will be more context-specific than language-specific.
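
A minimal sketch of the socially enriched contacts book described above, with hypothetical data standing in for the Facebook and Twitter API calls a real client would make:

```python
# Hypothetical contact record merged with a latest status update
# before the call is placed. Field names and data are invented.
contacts = {"Ravi": {"phone": "+91-98xxxxxx01"}}
latest_status = {"Ravi": "Stuck in traffic on the ring road"}

def contact_card(name):
    """Build the card shown on screen: contact info plus live status."""
    card = dict(contacts[name])
    card["status"] = latest_status.get(name, "(no recent update)")
    return card

print(contact_card("Ravi"))
# {'phone': '+91-98xxxxxx01', 'status': 'Stuck in traffic on the ring road'}
```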

6. Influencer marketing will be redefined
As social media continues to permeate more and more aspects of not only the way we interact with digital media but also other channels such as digital outdoor, commerce or online TV, we will see the significance of influencer marketing grow dramatically. As a basic example, the inclusion of Twitter in Google search results or Google’s soon-to-be-released Social Search will permeate search results with content that will not be managed by Google’s infamous PageRank but by social influence and relevance to your social network. Discovering people that can help you to reach your desired consumer will become exponentially more effective and important.

7. Ratings everywhere
In today’s world, having a commerce site that doesn’t have user ratings could actually prove to be a detriment to sales. In the near future, brands and businesses will more frequently place user ratings and accept open feedback on their actual websites. User ratings will become so common that marketers should expect to find them woven into most digital experiences. Non-linear techniques for evaluating sites to identify TRP will be coming through.

8. Social media agents
Managing the customer experience offline and online is already a key concern for marketers and customer-experience advocates. As businesses continue to support customers by monitoring and engaging in the social media space, tools to optimize this experience will become more important. Expect to see a certain percentage of responses handled by natural language engines that can respond to basic commentary such as “my service is down” or “I never received my package.”
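
A bare-bones sketch of such an agent: match basic complaints against keyword patterns and emit a canned first response, escalating anything unmatched to a human. Real systems would use a proper natural-language engine; the patterns and replies here are invented.

```python
# Toy "social media agent": keyword-intent matching with canned replies.
INTENTS = [
    (("service", "down"),
     "Sorry about the outage! Our status page has live updates."),
    (("never", "received"),
     "Apologies! Please DM your order number and we'll track the package."),
]

def respond(message):
    """Return a canned reply if all keywords of an intent appear,
    else None to hand off to a human agent."""
    words = set(message.lower().split())
    for keywords, reply in INTENTS:
        if all(k in words for k in keywords):
            return reply
    return None

print(respond("my service is down"))
print(respond("I never received my package"))
print(respond("what is your returns policy?"))  # None -> escalate to a human
```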

9. Riding the (Google) wave
It’s still early days, as Google Wave is primarily limited to developers, but it has the potential to revolutionize collaboration and engagement. Wave offers marketers a unique way, at minimal cost, to allow consumers to engage with each other in a way that is miles beyond anything we’re currently using. Savvy marketers will develop extensions for Wave that evolve its unique communication toolset into a rich brand experience that is immersive but allows for new levels of interaction, from crowdsourced storytelling to crowdsourced product design.

10. Thinking beyond “nowness”
In 2009 we became very focused on the real-time nature of social media. The implications behind consumer feedback and interaction around brands using tools like Twitter or Facebook’s news stream caused marketers to re-evaluate the power of social media tools in parallel to “traditional” digital-media channels such as search. Looking into the future, we’ll need to try to evaluate what’s next, and the likely answer is based on the next evolution of the web as we know it: the semantic web. In a semantic web world, search engines, for example, will anticipate the best search results we’re looking for based on what they know about us (such as all our public social networking profiles). There will be an opportunity for marketers who push the limits of their imagination to anticipate what marketing will look like in this next stage of the web and to create new and compelling experiences that we’re only touching the surface of now. Predictive analysis and data mining may be in high demand.

11. Social media everything and the return of digital media
Social functions will become so commonplace in digital experiences that the thought of not having socially-enhanced experiences will seem illogical. Digital media by its very nature is inherently social. I hope we’re not talking about social media in 2012, and we just refer to everything as digital media again.

12. Social networking will lead the way to mobile networking: easier tracking down of information, language-specific operation, and higher linguistic support. Social applications will offer more end-to-end support, meaning users will be able to write in one language and receive answers in another, without worrying about whether what they are writing is right.

13. Tracking down objects: language extraction is possible, and it might even extend to object recognition and tracking.

These 13 seem more than enough.

Thursday, December 9, 2010

cognitive science and religion


What is Cognitive Science?
And What Does It Have to Do with Religion?


John F. Kihlstrom
University of California, Berkeley






Paper presented at a conference on "Religion and Cognitive Science: From conflict to connection", co-sponsored by the Graduate Theological Union and the University of California, Berkeley. Berkeley, California, January 17, 2008.



For those of you who are from out of town, or at least not denizens of either Berkeley or Holy Hill, let me welcome you again to this conference on religion and cognitive science. The undergraduate cognitive science program is pleased to be a co-sponsor of this conference. As director of that program, it falls to me to provide an introduction to the field of cognitive science, and to say some things about what it has to offer the study of religion -- and, for that matter, what the study of religion has to offer cognitive science. But I must point out that there are about 50 faculty members involved in cognitive science at Berkeley, and if you asked them all what cognitive science is all about, I suspect that you would get about 50 different answers (see also Bechtel & Graham, 1998; Boden, 2006; Gardner, 1985; Nadel, 2003; Osherson, 1995; Sobel, 2001; Stillings et al., 1995; Thagard, 2005; Wilson & Keil, 2001).

So here's mine.

Cognition is about knowledge and knowing, and cognitive science tries to understand the acquisition, representation, and use of knowledge by minds, brains, machines, and social entities. The topics of cognitive science are pretty much coterminous with the topics of cognitive psychology -- but with a difference in approach that I hope to make clear.


A Capsule History of Cognitive Science

The deep origins of cognitive science are in modern philosophy -- which, since Descartes, has been focused on problems of epistemology, as opposed to metaphysics, ethics, and other traditional areas. Think about the debate in the 18th century between the rationalists and the empiricists concerning the origins of knowledge. This was also a time of scientific revolution in both physics and biology, and one would have thought that psychology would have been part of that scene as well. But the legacy of Cartesian dualism, with its emphasis on the immaterial nature of mind, was to take psychology off the scientific table. As late as Kant, in the late 18th century, psychology was an impossible science: measurement was essential to science, but the mind, being an immaterial object, was not subject to measurement; therefore, so Kant reasoned, psychology could not be a science. In less than 50 years, though, Ernst Weber (in 1834) and Gustav Fechner (in 1860) had discovered the first psychophysical laws, quantifying the relationship between the intensity of physical stimulation and the intensity of the resulting sensory experience; Hermann von Helmholtz performed experiments to understand the mechanisms of distance and motion perception (1856-1866); and Franciscus Donders had introduced (in 1868) reaction time as a means of measuring the speed of mental processes (Boring, 1950; Wozniak, 1992).

Reflecting the British empiricists' emphasis on experience as the source of knowledge, early experimental psychology focused on phenomena of sensation and perception. Indeed, Wilhelm Wundt (1873-1874), generally regarded as the "father" of experimental psychology, argued that psychology as a true quantitative, experimental Naturwissenschaft was limited to the study of sensation and perception; all "higher" mental processes, including what Wundt called "cultural" psychology, were consigned to a nonexperimental Geisteswissenschaft (but see Greenwood, 2003). Still, as early as 1885 Ebbinghaus proved Wundt wrong by inventing methods for the quantitative, experimental study of memory; a little later, Mary Whiton Calkins (1896) developed paired-associate learning to study the formation of associations, and Pavlov and Thorndike (both 1898) developed similar methods for studying learning in animals; and finally, in 1920, Clark Hull put the icing on the cake by adapting Ebbinghaus' methods to the study of concept formation, a major aspect of thinking.

Unfortunately, just as psychology was ready to address problems in cognition at all levels, the dark days of behaviorism descended on the field. In a syllogism reminiscent of Kant's, John B. Watson (1913, 1920) and other behaviorists argued that science was based on objective, public observation, but mental life was inherently subjective and private; therefore, psychology -- if it were to be a true science -- had to banish the mental from its discourse. Along the same lines, Gilbert Ryle (1949) famously characterized the mind as "the ghost in the machine". William James's science of mental life quickly became B.F. Skinner's (1938, 1953) science of behavior. Psychology, as one wag put it, having lost its soul, now lost its mind as well (Woodworth, 1921, p. 2).

The hegemony of behaviorism within psychology lasted for more than half a century, but during and after World War II the development of high-speed computers, cybernetics, and information theory set the stage for what has since come to be known as the cognitive revolution in psychology (for details, see Baars, 1986; Boden, 2006; Gardner, 1985; Hirst & Miller, 1988).

Of particular importance was the year 1956, in which Jerome Bruner and his colleagues published A Study of Thinking -- the first experimental exploration of concept-formation since Hull. For two months that summer, psychologists, computer scientists, and others met at Dartmouth to consider the prospects for artificial intelligence. And at a symposium that fall at MIT, Newell and Simon described their computer simulation of problem-solving; Noam Chomsky presented a new way of viewing language, with an emphasis on syntactical rules; and George Miller discussed the limitations of human information processing. All three of these papers were delivered on the same day: September 11, 1956, which George Miller has since characterized as the birthdate of cognitive science.

This is the birthdate of cognitive science, not of cognitive psychology, because in 1956, and for many years thereafter, the hegemony of behaviorism was such that there was no institutional home for cognitivism in American academic psychology. Bruner, Miller, and their colleagues, for example, had to leave the confines of Harvard's psychology department, in Memorial Hall (actually, Bruner was in Emerson, in the parallel Department of Social Relations, but that is another story), to set up their Center for Cognitive Studies in rented quarters off Harvard Square, at the corner of Bow and Arrow Streets (Cohen-Cole, 2007).

Nevertheless, the 1960s were a period of rapid development on many fronts (Baars, 1986; Gardner, 1985; Hirst & Miller, 1988). Of special importance, Ulric Neisser published his seminal monograph on cognitive psychology, creating the textbook infrastructure for new courses on cognition, while the journal of the same name provided an outlet for publication of cognitive research.

Developments continued into the 1970s. Of particular importance was the support of the Sloan Foundation, which was keenly interested in promoting the leading edges of scientific progress. It established a program in cognitive science in 1976, and beginning in 1979 supported the development of training programs in cognitive science at a number of institutions, including UC Berkeley.

The result of all this activity was to institutionalize an interdisciplinary field of cognitive science, paralleling, and competing with, cognitive psychology. The Sloan Foundation was quite clear on this point: cognitive science was not to be a wholly owned subsidiary of psychology. Rather, it was conceived as "an autonomous science of cognition" -- an interdisciplinary effort bringing together a number of separate disciplines -- not just psychology and computer science, but philosophy, linguistics, neuroscience, and anthropology into the mix as well. It was to be, in Howard Gardner's phrase, "the mind's new science", with each of the fields comprising "the cognitive hexagon" making its own unique contribution to the emergent whole (Gardner, 1985).


Surveying the Cognitive Hexagon

Philosophy sits at the apex of the cognitive hexagon. This is, of course, where it all began, with epistemological concerns for the nature of knowledge and knowing, including the debate between nativists and empiricists. Philosophers also gave us the mind-body problem to contend with: how mind, or at least consciousness, could emerge from physical particles interacting in fields of force. Viewing the field in 1985, Howard Gardner asserted that the role of philosophers in cognitive science was merely to ask the questions and check the answers, and they have played that role well.

In the beginning, philosophers addressed these questions using the stock tools of their trade: introspection, reasoning, and, later, linguistic analysis. But more recently, the boundary between philosophical and scientific inquiry has been weakened, and one of the characteristics of contemporary philosophical inquiry into cognition is that philosophers are now interested in the actual results of empirical research -- so much so that some philosophers have argued that, in the new "neurophilosophy", the ordinary mental language of "folk psychology" -- percept, memory, belief, desire, and the like -- will be replaced by the constructs of neuroscience. We'll see.

For its part, psychology offers cognitive science a wide variety of experimental methods by which we can study cognitive processes empirically. This was true even before the cognitive revolution, when what used to be called "experimental psychology" focused its attention on problems of sensation and perception, learning, and memory (in the form of verbal learning). But in the wake of the cognitive revolution, psychology has added an increasingly sophisticated body of theory concerning how knowledge is acquired and represented in the mind, and how new knowledge can be generated through processes of reasoning, problem-solving, judgment, and decision-making.

But the cognitive perspective has permeated the approach of psychology to even its traditional topics. In the psychology of sensation, for example, the early (and, in my view, proto-behavioristic) emphasis on stimulus intensity and thresholds for sensation has been replaced by the theory of signal detection, which emphasizes the expectations and motives of the observer. In perception, the cognitive view stresses the inherent ambiguity of the stimulus, and the need for the observer to go "beyond the information given" by the stimulus, drawing on knowledge, expectations, and inferences to fill in the gaps, and construct a mental representation of the object or event in the environment. In memory, the injunction by Frederick Bartlett (1932) that "the psychologist, of all people, must not stand in awe of the stimulus" led to a view of the memory trace as similarly incomplete and ambiguous, and remembering as reconstructive activity involving problem-solving, reasoning, and inference. Even in animal learning, the earlier emphasis on the passive formation of associations between stimulus and response was replaced by a view of the behaving organism actively attending to unexpected stimuli, and seeking to predict and control events in its environment. The cognitive revolution began by offering an alternative to psychology, but at this point psychology is thoroughly imbued with the cognitive point of view.
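
The shift from raw thresholds to signal detection theory can be made concrete with the standard sensitivity measure d', which separates the observer's discrimination ability from their response bias. Here is that textbook computation, sketched with made-up hit and false-alarm counts:

```python
# Standard d' (d-prime) computation from signal detection theory.
# The trial counts below are invented for illustration.
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """d' = z(hit rate) - z(false alarm rate)."""
    hit_rate = hits / (hits + misses)
    fa_rate = false_alarms / (false_alarms + correct_rejections)
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

# Same hit rate, different false-alarm rates -> different sensitivity:
print(round(d_prime(80, 20, 40, 60), 2))  # liberal observer, lower d'
print(round(d_prime(80, 20, 10, 90), 2))  # more sensitive observer
```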

Similarly, linguistics offers a set of specialized methods for the study of language. Traditionally, language was conceived as a particularly powerful means of communication, but the cognitive revolution in linguistics, initiated by Noam Chomsky, has moved the field far beyond that limited conception. In the first place, language is not just a means of communication: it is also a particularly powerful tool for thought, with words as symbolic representations, and the rules of syntax as a means of combining familiar concepts into entirely new thoughts. By affording us the ability to speak, and communicate, ideas that have never been thought -- much less spoken -- before, Chomsky persuasively argues that the capacity for language is the basis of human freedom.

While Chomsky stressed the strict separation of syntax from semantics (and focused his attention on the former), a movement in "cognitive linguistics" reasserted the communicative function of language, and argued for the priority of meaning, or semantics, over syntax. While Chomskian linguistics thinks of words as symbols whose meaning is constituted by a list of features, cognitive linguistics argues that words derive their meaning from the way they are used in communication: meaning is not given by the word, but rather is a matter of frame and metaphor. Moreover, while Chomskian linguistics thinks of language as following its own set of rules, cognitive linguistics thinks of language as a communication skill not unlike other cognitive skills. Finally, through the "speech act" theories promoted by John Searle and others, we have come to understand that language is not just a form of thinking, but also a form of action by which we can manipulate and transform the world around us, bringing the world in line with our ideas.

From the beginning, computer science, with its flowcharts of boxes and arrows, provided a means of conceptualizing the components of the mind and how they related to each other. But computer science has also offered the computer program itself as a medium for writing cognitive theory: Herbert Simon has long insisted that you don't really understand a cognitive process unless you have written an operating computer program to simulate it. Many of the contributions of computer science revolve around the quest for artificial intelligence, and here I will just point out that there is a distinction between what John Searle has called "Weak AI", in which the computer serves merely as a medium for writing more complicated theories than would be possible verbally, or even mathematically; and "Strong AI", which seeks to duplicate human mental states and processes in a machine made of silicon chips. While Weak AI has been enormously successful, the jury is still out on Strong AI, with critics like Searle pointing out that the mere duplication of input-output relations does not mean that the intervening processes are identical in mind and machine. More recently, we have seen the emergence of what I have called "Pure AI", which tries to get machines to perform "intelligent" tasks without regard for how humans do it. A good example is "Deep Blue" (technically, a refined version known as "Deeper Blue"), which in 1997 edged out Garry Kasparov, the reigning world chess champion. It was a stunning accomplishment, the Holy Grail of artificial intelligence; but IBM's software developers had absolutely no interest in simulating how a human grandmaster approaches the game.

Related to Pure AI is a distinction between what John Haugeland (Haugeland, 1985) called "Good Old Fashioned Artificial Intelligence" (GOFAI), based on traditional computer architectures, and the connectionist architectures which have become popular in machine vision and robotics. In GOFAI, knowledge is represented symbolically in strings of 0s and 1s that have a discrete address within the computer's memory; and then these symbols are manipulated by the computer's program. But in connectionist architectures, there is no distinction between symbols and rules: instead, knowledge is represented by patterns of activation distributed over a large number of processing elements. Connectionist architectures are sometimes described as more "neurally plausible" than symbol-and-rule architectures, and from the point of view of Pure AI, they certainly are very powerful systems for learning. But it remains to be determined whether they capture the essence of how the human mind performs its cognitive functions (McClelland & Patterson, 2002a, 2002b; S. Pinker & M. Ullman, 2002; S. Pinker & M. T. Ullman, 2002; Pinker & Ullman, 2003).
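
The contrast can be seen in a few lines of code: a GOFAI-style fact sits at a discrete address and is manipulated by explicit rules, while in a connectionist-style system the same "knowledge" exists only as a pattern of learned weights. The toy perceptron below learns OR from examples; it illustrates the architectural difference only, and is not a serious cognitive model.

```python
# GOFAI-style: a symbol at a discrete address, manipulated by explicit rules.
facts = {("bird", "can_fly"): True, ("penguin", "can_fly"): False}
print(facts[("bird", "can_fly")])

# Connectionist-style: learn OR from examples; the "knowledge" is nothing
# but a distributed pattern of weights, with no separate symbols or rules.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b, lr = [0.0, 0.0], 0.0, 0.1
for _ in range(20):                      # perceptron training loop
    for (x1, x2), target in data:
        out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
        err = target - out               # error drives the weight update
        w[0] += lr * err * x1
        w[1] += lr * err * x2
        b += lr * err

print(w, b)  # the learned "rule" is distributed over these numbers
print([1 if w[0] * x1 + w[1] * x2 + b > 0 else 0 for (x1, x2), _ in data])
```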

If, as Howard Gardner put it, the job of the philosophers is to ask the questions and check the answers, I suppose that the job of the neuroscientists is to figure out how the brain does it. Beginning with the assumption that the brain is the physical basis of mind -- that, as Steve Kosslyn has put it, "the mind is what the brain does" -- cognitive neuroscientists seek to understand how mental states and processes relate to states and processes in the brain.

Early in the evolution of cognitive science, the neuroscience component was largely focused on the neuron itself, and the analogy between the all-or-none property of neural discharge and the representation of information in strings of 0s and 1s. At higher levels of analysis, though, the search for neural correlates of particular cognitive functions was inhibited by Lashley's (Lashley, 1950) "Law of Mass Action", which asserted that higher cognitive functions were performed by an "association cortex" acting as a whole. But the discovery in the 19th century of Broca's and Wernicke's areas, seemingly specialized for speech and language, already threatened Lashley's law. And beginning in the 1950s, with the neurosurgical patient known as H.M., cognitive neuropsychologists discovered a whole host of discrete brain areas apparently specialized for various functions. The search for such centers has been greatly abetted by the development of technologies such as PET and fMRI, which for the first time enable researchers to view the brain in action as subjects perform various tasks. This program is not without its controversies -- not least because the interpretation of the brain image is only as good as the experimental methodology used to generate it. But contemporary cognitive neuroscience seems close to fulfilling its promise of revealing the neural mechanisms underlying complex cognitive functions. Cognitive neuroscience is now dominated by the doctrine of modularity, which asserts (among other things) that various cognitive functions are performed by dedicated neural centers and systems (Fodor, 1983). And, indeed, it seems that hardly a day goes by without some cognitive neuroscientist reporting that some cognitive task is performed by, or in, some discrete clump of brain tissue.

The sixth and final point in the Cognitive Hexagon is anthropology -- which, as Boden (Boden, 2006) has pointed out, is the "missing" or "unacknowledged" discipline of cognitive science. Cognitive science programs are full of philosophers, psychologists, linguists, computer scientists, and neuroscientists, but hardly any of them have any anthropologists, and those that do don't have very many. Which is too bad, because at first glance one would think that anthropology would be central to the field: after all, one way to think about culture is as a body of knowledge and belief shared by a group of people, passed through social learning from one generation to the next. Much early cognitive anthropology was stimulated by Western contact with non-western cultures, and an interest in exploring differences in thought processes between members of "developed" and "primitive" cultures. Later, Soviet historicism added the idea that economic and political development would change the way people thought. There was also the "Sapir-Whorf" hypothesis that language, clearly a part of culture, constrains (or at least influences) thought.

To a great extent, Chomskian linguistics put a damper on cognitive anthropology, by virtue of its assertion that all languages are fundamentally the same. If this were so, then there would be no point in seeking ways in which people who speak different languages might also think differently. And there were also empirical problems: it turned out that Eskimos don't really have seven different words for snow; and if they do, White Anglo-Saxon Protestants skiing at Alpine Meadows probably have at least as many (Martin, 1986; Pullum, 1989). More substantively, for example, despite the fact that different languages have different numbers of color terms, everyone makes the same discriminations among colors. Still, partly under the influence of post-Chomskian cognitive linguistics, we have come to understand that language does shape thought, by providing metaphors by which thought is expressed, and frames by which expressions are interpreted (Eve Sweetser, in her talk, will probably have more to say about this). There may be something to a weak version of the Sapir-Whorf hypothesis after all. Moreover, under the auspices of a revived "cultural psychology", we have begun to understand that there may be cultural diversity in thought processes after all -- for example, Euro-Americans may prefer linear, and East Asians "dialectical", forms of thinking (Nisbett, 2003). Again, much of this work is highly controversial, and in the same way that it might not be exactly true that Eskimos have seven words for snow, it might not be exactly true that "white men can't contextualize" (Shea, 2001).

Another positive development is that there has begun to be a cognitive revolution in social sciences other than anthropology (Turner, 2001). Economics, for example, has moved from the abstract principles that characterize the neoclassical view to a new "behavioral economics" that tries to understand how people actually think about money. And a new breed of "cognitive sociologists" have become interested in such matters as the social organization of knowledge -- how knowledge is distributed within groups, status differences in access to knowledge, and the like (Swidler & Arditi, 1994); how knowledge is acquired, represented, and used by social organizations and institutions (Zerubavel, 1997); and, indeed, how certain aspects of reality are constructed through collective cognitive activity (Searle, 1995).


What Cognitive Science Is Not

So that, in a nutshell, is what cognitive science is: an interdisciplinary activity dedicated to understanding the acquisition, representation, and use of knowledge. But to further characterize the field, I should also say a few things about what cognitive science is not.

First, cognitive science is not neuroscience, because there are many varieties of neuroscience that don't have any interest in cognition per se. But cognitive science isn't even cognitive neuroscience. Neuroscience has its proper place in cognitive science, with its concern for the biological substrates of cognitive processes, how information is represented in the brain, what can be learned from brain-damaged patients, and specialized methodologies for brain-imaging. There are those who argue that neuroscience leads cognitive science, because knowledge of the structure of the nervous system will constrain theories of cognition at higher levels. Maybe, although it has to be said that there are no good examples of such constraint in the literature so far (Coltheart, 2006a, 2006b; Hatfield, 1988, 2000; Henson, 2005, 2006; Kihlstrom, 2007a, 2007b). But, in fact, the reverse seems to be true: the proper interpretation of brain function can come only when we have achieved a correct understanding of cognitive function at the psychological level of analysis. Or, as I like to put it: psychology without neuroscience is still psychology, while neuroscience without psychology is just neuroscience.

Second, cognitive science is not psychology, or even cognitive psychology. Recall that cognitive science was founded in the first place because academic psychology provided no institutional home for the study of cognition. That's all in the past now: contemporary psychology is thoroughly cognitive, almost to a fault. In some sense, psychology is broader than cognitive science, because it encompasses emotion, motivation, and behavior as well as cognition. But in another sense, cognitive science is broader than psychology, because it is a truly interdisciplinary activity in which psychologists work alongside others -- linguists, philosophers, neuroscientists, computer scientists, and anthropologists and other social scientists -- to solve problems of knowledge acquisition, representation, and use.

Third, at least from my point of view, the "cognitive" in cognitive science is not a euphemism for the mental. Cognitive science began with the insight that, in order to understand behavior, we had to understand the internal structures and processes that mediated between stimulus and response. Some of these structures and processes are cognitive in nature, but cognition isn't all there is to the mind. There's also emotion and motivation. In the wake of the cognitive revolution, some cognitive scientists have gone so far as to assert the hegemony of the cognitive -- that emotion and motivation are entirely derived from cognition, that feelings and desires are, essentially, beliefs about what we feel and desire. Maybe. On the other hand, Kant -- not to mention Plato -- argued that feelings and desires are irreducible -- that they have an ontological status that is independent of knowledge and cognition.

In the present state of the science, that seems like an equally viable hypothesis, and in fact we can now see the emergence of a separate "affective science", or "affective neuroscience", modeled on cognitive science (and cognitive neuroscience), but independent of it (Davidson, 2000; Lane & Nadel, 2000; Panksepp, 1998). A conative science, focused on motivation, cannot be far behind! Cognitive science is best construed as a specialized science of knowledge -- lest it become overbroad. Doubtless, cognitive, emotional, and motivational states interact with each other. Bruner's "New Look" in perception was based on the proposition that emotion and motivation influenced perception and other cognitive processes (J. Bruner, 1992, 1994; J. S. Bruner & Klein, 1960). Some emotional states appear to flow from cognitive appraisals (Smith & Ellsworth, 1985), and people can regulate their emotions through cognitive transformations (Lazarus, 1991). Those relations deserve study too. But the people who study cognition, emotion, and motivation, all aspects of mental life together, are probably better called psychologists.


What Does Cognitive Science Offer Religion?

So what can cognitive science do for religion -- or, at least, for the study of religion?

First, the question might be rephrased as: What can cognitive science do to religion? It's pretty clear that cognitive science has provided new intellectual support for the village atheists among us (apologies to Masters, 1916). Previous arguments against religion were based on the problem of evil in the world, or dissections of proofs for the existence of God (I have to say, for myself, that I always thought that St. Anselm's proof was little more than a debater's trick). But cognitive science adds a new dimension to arguments from the outside. For Sigmund Freud religion was an illusion, a product of collective neurosis -- and thus a disorder of emotion (Freud, 1927/1968). But for Richard Dawkins (Dawkins, 2006), the evolutionary biologist, belief in God is a delusion -- a disorder of cognition. And Daniel Dennett (Dennett, 2006), the cognitive scientist, shows us how the delusion actually works: you don't believe in God; you just believe you do, as a result of adopting the intentional stance. For Jesse Bering, to take another example, religious belief may have been evolutionarily adaptive, but it is a mistake nonetheless -- an inappropriate generalization of the "theory of mind" by which we try to understand the thoughts and intentions of other people, to find intelligence and intentionality in the natural world as well (Bering, 2006a, 2006b).

On the positive side, cognitive science would seem to provide a useful theoretical and methodological apparatus to address questions that are central to religion. For example: What is the nature of religious belief? Philosophers and other cognitive scientists generally identify "belief" with any representational mental state, which combines with some proposition to generate what Russell called a propositional attitude. So, when someone says, "I believe in God", how does that differ from saying "I believe that it is raining outside"? Put bluntly: What is knowledge of God knowledge of?

How is knowledge of God represented in the mind? Cognitive science distinguishes among different types of representations, such as perception-based representations, which preserve knowledge about the physical structure of an object but not its meaning; and meaning-based representations, which go beyond physical description to include semantic and conceptual features. Some religions have very rich representations of God (or of gods), others do not. Does the nature of the representation have any consequences for the nature of the belief? Or vice-versa?

How is knowledge of God acquired? Cognitive science offers us two views of this matter, nativism and empiricism, plus a combination of the two. In some religions, God acts in history; in others, not so much: does this difference have consequences for the nature of religious belief? Moreover, some believers have a direct experience of God, whereas for others, religious belief is acquired vicariously, through precept or example. Was there a difference in religious belief between Moses, who actually received the Ten Commandments from God, and the rest of the Israelites, who were simply told about them (Exodus 19)? Or between Paul, who encountered the risen Christ on the road to Damascus (1 Corinthians 15:1-11), and the Corinthians, to whom he related the story?

What is the nature of religious experience itself? Cognitive science may help us to analyze the cognitive components of the experience, in terms of sensation, perception, memory, thought, and language. But cognitive science may not be enough to encompass such a topic, and it may have to be supplemented by affective science (or, at least, the rest of psychology), to understand the emotional dimensions as well. William James clearly thought so, which is why he gave so much space to the affective aspects of religious experience (James, 1902/1985).

It is important to understand that, in these respects, different religions may require different cognitive analyses. For example, H. Allen Orr (Orr, 2007), expanding on an idea of Philip Kitcher (Kitcher, 2007), has offered a tentative taxonomy of religions: providentialist, which holds that the universe was created by a benevolent God to whom the believer can pray; fundamentalist, asserting the literal truth of sacred texts; supernaturalist, not so closely linked to literal interpretation; deist, based on the idea that there is a mind at the base of the material universe; spiritual religion, which focuses on ethical behavior and examples of lives rightly lived; and finally, secular humanism, which is uncomfortably close to spiritual religion but abjures any and all deities entirely. As Orr and Kitcher show, the various kinds of religion respond differently to the theory of evolution and other Enlightenment critiques. It may be that the cognitive science of religion will depend on what kind of religion it is a cognitive science of -- that is, what the adherents of that religion actually believe.


What Does Religion Offer Cognitive Science?

But I don't think that the relationship between cognitive science and religion is a one-way street. Religion can also make a positive contribution to cognitive science, by offering a unique perspective on certain topics. For example, let us return to the nature of religious belief. As I noted earlier, cognitive science usually thinks of belief as an umbrella term for all sorts of cognitive (and other mental) states; but religion reminds us that there are real phenomenological and epistemological distinctions between believing something, on the one hand, and knowing, or perceiving, or imagining, or remembering, or thinking something, on the other. Beliefs are often defined as convictions in the truth of some statement, independent of, or in the absence of, sufficient evidence. Then there is the distinction between the belief that something is true, and the belief in something. Belief that, in whatever form it takes, always requires some sort of justification. But religious faith is described by St. Paul as "the substance of things hoped for, the evidence of things unseen" (Hebrews 11:1). Believing that it's raining is not the same thing as knowing or perceiving that it's raining; but believing in God is not the same thing as believing that it's raining, and believing in God isn't the same thing as believing in Santa Claus, either. Cognitive scientists might get more clarity on these kinds of distinctions if they would undertake a serious, sympathetic inquiry into religious beliefs.

As another example, cognitive science is intensely concerned with the nature and function of consciousness. Altered states of consciousness offer one venue for consciousness, and there has lately been a resurgence of interest in the effects of meditation on consciousness. Much of this research has been focused on Hindu or Buddhist meditative practices, and much of it has been more concerned with physiology than with cognition (I say this as one who spent altogether too much of his junior year in college pestering Shibayama Roshi, abbot of Nanzen-ji, to allow me to slap electrodes on his head while he practiced zazen). Religious scholars remind us that there are meditative practices in Western religious traditions, too, and they may be equally deserving of our attention. Shibayama's response to my entreaties was to ask whether I would make the same request of the Pope while he said Mass. This was, I think, my own personal koan -- or maybe it was my own personal keisaku, the stick that Zen masters use to keep novices awake. Contemplative prayer is a part of Jewish, Christian, and Muslim traditions as well -- it is, as St. Teresa reminds us, the way we gain intimate knowledge of God; and I think that our ignorance of these traditions reflects a kind of scientific Orientalism on our part.

Moreover, just as research on cognition and culture would benefit from deeper understanding of the cultures from which the subjects are drawn, research on meditation would benefit from a deeper understanding of the religious context in which the act takes place. Meditation, ripped out of its religious context, may not be the same thing as the same practice in context.

It is also important to understand that EEG tracings and brain images are not self-evident; only by knowing what purpose the meditation serves can we really understand what is going on in the brain: It makes a difference whether the goal of meditation is mindlessness or mindfulness, "unconditional loving-kindness and compassion" (Lutz, Greischar, Rawlings, Ricard, & Davidson, 2004, p. 16369) or the enlightened state of satori. If you're going to understand the cognitive science of religion, you've got to get the religion right: and this is no business for amateurs. In this respect, Shibayama gave me another koan: even before he asked me about the Pope at Mass, he kept asking "What would it mean?". I didn't have an answer then, but I think I have it now. The meaning of meditation isn't going to be found in EEG tracings or illuminated pixels. Rather, those neural measures are given meaning by the cognitive goal of the exercise. If the cognitive goal of Zen meditation is the de-automatization of everyday thought patterns (Deikman, 1966) -- well, cognitive science knows how to measure that, and it's not with fMRI (see Postscript). If the goal of Christian prayer is to gain intimate knowledge of God, as St. Teresa said -- well, perhaps not so much. But the larger point is that the cognitive goal of meditation will differ from one religion to the other.


Points of Departure

There are surely other ways in which cognitive science can contribute to religion, and in which religion can contribute to cognitive science. I may not be religious enough (or, for that matter, enough of a cognitive scientist!) to think of all of them. But I think these points of contact provide a useful starting place for joint inquiry -- an inquiry that, I think, only makes sense if religious belief and experience are taken seriously by cognitive scientists -- and not just taken as something to be dismissed or explained away.


Postscript

OK, so how do we measure de-automatization? Back in 1966, when Deikman was writing about this, "de-automatization" was a pretty vague term. That's all changed now, as "automaticity" has a fairly precise technical meaning in cognitive psychology. In general, we can say that an automatic process has four features:


* inevitable evocation by an environmental stimulus;
* incorrigible completion once evoked;
* efficient execution, meaning that it consumes few or no attentional resources; and
* parallel processing, meaning little or no interference with, or by, other processes.

The classic demonstration of automaticity is the Stroop test, in which subjects are asked to name the colors in which words are printed, ignoring the meaning of the words themselves. Subjects find this hard to do when the words are color names, and especially when the color names conflict with the colors. The traditional explanation of this effect is that, for skilled readers, reading words has become automatized -- we just can't help it, and this automatic reading interferes with the task of naming colors.

So, if one of the consequences of meditation is the de-automatization of thought processes, we would expect that meditators would show reduced interference on the Stroop test.
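To make the measurement concrete, here is a toy calculation of Stroop interference, scored as the difference between mean reaction times on incongruent and congruent trials. This is only a sketch in Python, and the reaction times below are invented for illustration:

# A minimal sketch of scoring Stroop interference: mean reaction time
# on incongruent trials (word and ink color conflict) minus mean
# reaction time on congruent trials. The sample data are invented.

def mean(xs):
    return sum(xs) / len(xs)

def stroop_interference(congruent_rts, incongruent_rts):
    """Interference = mean RT (incongruent) - mean RT (congruent), in ms."""
    return mean(incongruent_rts) - mean(congruent_rts)

# Hypothetical reaction times (milliseconds) before and after meditation:
pre = stroop_interference([610, 640, 655], [820, 860, 845])
post = stroop_interference([605, 630, 650], [730, 760, 745])

print(f"interference pre: {pre:.0f} ms, post: {post:.0f} ms")
# De-automatization would show up as a drop in interference from pre to post.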

And that's exactly what Alexander found in a study of Transcendental Meditation (Alexander et al., 1989).

More systematic research by Wenk also showed that a secular (non-TM) meditation exercise reduced Stroop interference (Wenk-Sormaz, 2005).

Wenk also examined the effects of meditation on the generation of category instances. Initially, she hoped that meditation would lead to a "freeing up" of thought, manifested in a tendency to generate less frequent, more atypical instances.

In her first attempt, this didn't work out.

However, a later experiment found enhanced production of atypical instances during a category generation task -- but only when subjects were specifically instructed to produce atypical, as opposed to typical, instances.

Now here is a contribution of religion to cognitive science: Implicit in the standard concept of automaticity is the idea that automaticity, whether innate or achieved through extensive practice, is permanent. By contrast, meditation research seems to indicate that automatization can be reversed.

The only fly in the ointment is that neither Alexander nor Wenk employed a meditation exercise that resembles zazen. Alexander used Transcendental Meditation, and Wenk used another yoga-like exercise that focused on breathing. Again, you've got to pay attention to the details of the religious discipline and its cognitive goals. But the larger point is that the cognitive effects of meditation are a place where religion and cognitive science can meet on common ground.


References

Alexander, C. N., Langer, E. J., Newman, R. I., Chandler, H. M., et al. (1989). Transcendental Meditation, mindfulness, and longevity: An experimental study with the elderly. Journal of Personality & Social Psychology, 57(6), 950-964.

Baars, B. J. (1986). The cognitive revolution in psychology. New York: Guilford Press.

Bechtel, W., & Graham, G. (Eds.). (1998). A companion to cognitive science. Malden, Ma.: Blackwell.

Bering, J. M. (2006a). The cognitive psychology of belief in the supernatural. American Scientist, 94(2), 142-149.

Bering, J. M. (2006b). The folk psychology of souls. Behavioral & Brain Sciences, 29, 453-498.

Boden, M. A. (2006). Mind as machine: A history of cognitive science. Oxford: Oxford University Press.

Boring, E. G. (1950). A history of experimental psychology (2nd ed.). New York: Appleton-Century-Crofts.

Bruner, J. (1992). Another look at New Look 1. American Psychologist, 47, 780-783.

Bruner, J. (1994). The view from the heart's eye: A commentary. In P. M. Niedenthal & S. Kitayama (Eds.), The heart's eye: Emotional influences in perception and attention (pp. 269-286). San Diego: Academic Press.

Bruner, J. S., & Klein, G. S. (1960). The function of perceiving: New Look retrospect. In W. Wapner & B. Kaplan (Eds.), Perspectives in psychological theory: Essays in honor of Heinz Werner (pp. 61-77). New York: International Universities Press.

Cohen-Cole, S. (2007). Instituting the science of mind: Intellectual economies and disciplinary exchange at Harvard's Center for Cognitive Studies. British Journal for the History of Science.

Coltheart, M. (2006a). Perhaps functional neuroimaging has not told us about the mind (so far)? Cortex, 42, 422-427.

Coltheart, M. (2006b). What has functional neuroimaging told us about the mind (so far)? Cortex, 42, 323-331.

Davidson, R. J. (2000). Cognitive neuroscience needs affective neuroscience (and vice versa). Brain & Cognition, 42(1), 89-92.

Dawkins, R. (2006). The god delusion. New York: Houghton Mifflin.

Deikman, A.J. (1966). De-automatization and the mystic experience. Psychiatry, 29, 334-348.

Dennett, D. C. (2006). Breaking the spell: Religion as a natural phenomenon. New York: Viking.

Fodor, J. A. (1983). The modularity of the mind. Cambridge, Ma.: MIT Press.

Freud, S. (1927/1968). The future of an illusion. In J. Strachey (Ed.), The standard edition of the complete psychological works of Sigmund Freud (Vol. 21). London: Hogarth Press.

Gardner, H. (1985). The mind's new science : A history of the cognitive revolution. New York: Basic Books.

Greenwood, J. D. (2003). Wundt, Völkerpsychologie, and experimental social psychology. History of Psychology, 6(1), 70-88.

Hatfield, G. (1988). Neuro-philosophy meets psychology: Reduction, autonomy, and physiological constraints. Cognitive Neuropsychology, 5, 723-746.

Hatfield, G. (2000). The brain's "new" science: Psychology, neurophysiology, and constraint. Philosophy of Science, 67(Proceedings), S388-S403.

Haugeland, J. (1985). Artificial intelligence: The very idea. Cambridge, Ma.: MIT Press.

Henson, R. (2005). What can functional neuroimaging tell the experimental psychologist? Quarterly Journal of Experimental Psychology, 58A, 193-233.

Henson, R. (2006). What has neuropsychology told us about the mind (so far)? Cortex, 42, 387-392.

Hirst, W. (Ed.). (1988). The making of cognitive science: Essays in honor of George A. Miller. Cambridge: Cambridge University Press.

James, W. (1902/1985). The varieties of religious experience. Cambridge, Ma.: Harvard University Press.

Kihlstrom, J. F. (2007a). Does neuroscience constrain social-psychological theory? [Expanded version]. Retrieved April 25, 2007, from http://socrates.berkeley.edu/~kihlstrm/SPSPDialogue06.htm

Kihlstrom, J. F. (2007b, May). Social neuroscience: The footprints of Phineas Gage. Keynote address presented at a conference on "The Neural Bases of Social Behavior", Austin, Texas.

Kitcher, P. (2007). Living with Darwin: Evolution, design, and the future of faith. New York: Oxford University Press.

Lane, R. D., & Nadel, L. (Eds.). (2000). The cognitive neuroscience of emotion. New York: Oxford University Press.

Lashley, K. S. (1950). In search of the engram. In Symposia of the Society for Experimental Biology (Vol. 4, pp. 454-482). New York: Cambridge University Press.

Lazarus, R. S. (1991). Cognition and motivation in emotion. American Psychologist, 46, 352-367.

Lutz, A., Greischar, L. L., Rawlings, N. B., Ricard, M., & Davidson, R. J. (2004). Long-term meditators self-induce high-amplitude gamma synchrony during mental practice. Proceedings of the National Academy of Sciences, 101, 16369-16373.

Martin, L. (1986). "Eskimo words for snow": A case study in the genesis and decay of an anthropological example. American Anthropologist, 88, 418-423.

Masters, E. L. (1916). The village atheist. In Spoon River anthology. New York: Macmillan.

McClelland, J. L., & Patterson, K. (2002a). Rules or connections in past-tense inflections: what does the evidence rule out? Trends in Cognitive Sciences, 6(11), 465-472.

McClelland, J. L., & Patterson, K. (2002b). 'Words or Rules' cannot exploit the regularity in exceptions. Trends in Cognitive Sciences, 6(11), 464-465.

Nadel, L. (Ed.). (2003). Encyclopedia of cognitive science. London: Macmillan Publishers.

Nisbett, R. (2003). The geography of thought: How Asians and Westerners think differently...and why. New York: Free Press.

Orr, H. A. (2007). A religion for Darwinians? [review of Living with Darwin: Evolution, Design, and the Future of Faith by P. Kitcher]. New York Review of Books, 33-35.

Osherson, D. N. (Ed.). (1995). An invitation to cognitive science (2nd ed.). Cambridge, Ma.: MIT Press.

Panksepp, J. (1998). Affective neuroscience: The foundations of human and animal emotions. New York, NY, US: Oxford University Press.

Pinker, S., & Ullman, M. (2002). Combination and structure, not gradedness, is the issue. Trends in Cognitive Sciences, 6(11), 472-474.

Pinker, S., & Ullman, M. T. (2002). The past and future of the past tense. Trends in Cognitive Sciences, 6(11), 456-463.

Pinker, S., & Ullman, M. T. (2003). Beyond one model per phenomenon. Trends in Cognitive Sciences, 7(3), 108-109.

Pullum, G. K. (1989). The great Eskimo vocabulary hoax. Natural Language & Linguistic Theory, 7, 275-281.

Searle, J. R. (1995). The construction of social reality. N.Y.: Free Press.

Shea, C. (2001). White men can't contextualize. Lingua Franca, 11(6).

Smith, C. A., & Ellsworth, P. C. (1985). Patterns of cognitive appraisal in emotion. Journal of Personality & Social Psychology, 48(4), 813-838.

Sobel, C. P. (2001). The cognitive sciences: An interdisciplinary approach. London: Mayfield.

Stillings, N. A., Weisler, S. E., Chase, C. H., Feinstein, M. H., Garfield, J. L., & Rissland, E. L. (1995). Cognitive science: An introduction (2nd ed.). Cambridge, Ma.: MIT Press.

Swidler, A., & Arditi, J. (1994). The new sociology of knowledge. Annual Review of Sociology, 20, 305-329.

Thagard, P. (2005). Mind: An introduction to cognitive science (2nd ed.). Cambridge, Ma.: MIT Press.

Turner, M. (2001). Cognitive Dimensions of Social Science: The Way We Think About Politics, Economics, Law, and Society. New York: Oxford University Press.

Wenk-Sormaz, H. (2005). Meditation can reduce habitual responding. Advances in Mind-Body Medicine, 21, 33-49.

Wilson, R. A., & Keil, F. C. (Eds.). (2001). MIT Encyclopedia of Cognitive Science. Cambridge, Ma.: MIT Press.

Woodworth, R. S. (1921). Psychology: A study of mental life. New York: Holt.

Wozniak, R. H. (1992). Mind and body: Rene Descartes to William James. Bethesda, Maryland and Washington, D.C.: National Library of Medicine and American Psychological Association.

Zerubavel, E. (1997). Social mindscapes : An invitation to cognitive sociology. Cambridge, Mass.: Harvard University Press.

Friday, December 3, 2010

Artificial intelligence application

What we can do with AI

We have been studying this issue of AI application for quite some time now and know the terms and facts. But what we all really need to know is what we can do to get our hands on some AI today. How can we as individuals use our own technology? We hope to discuss this in depth (but as briefly as possible) so that you, the consumer, can use AI as it is intended.

First, we should be prepared for a change. Our conservative ways stand in the way of progress. AI is a new step that can be very helpful to society. Machines can do jobs that require following detailed instructions and constant mental alertness. AI, with its learning capabilities, can accomplish those tasks, but only if the world's conservatives are ready to change and allow this to be a possibility. It makes us think of how early man finally accepted the wheel as a good invention, not something that took away from his heritage or tradition.

Secondly, we must be prepared to learn about the capabilities of AI. The more use we get out of machines, the less work is required of us; in turn, there are fewer injuries and less stress for human beings. Human beings are a species that learns by trying, and we must be prepared to give AI a chance, seeing it as a blessing, not an inhibition.

Finally, we need to be prepared for the worst of AI. Something as revolutionary as AI is sure to have many kinks to work out. There is always the fear that, if AI is learning-based, machines will learn that being rich and successful is a good thing and then wage war against economic powers and famous people. So many things can go wrong with a new system that we must be as prepared as we can be for this new technology.

However, even though the fear of the machines is there, their capabilities seem boundless. Whatever we teach AI, it will suggest in the future if a positive outcome arises from it. AI systems are like children that need to be taught to be kind, well-mannered, and intelligent. If they are to make important decisions, they should be wise. We as citizens need to make sure AI programmers are keeping things on the level and doing the job correctly, so that no future accidents occur.
AIAI: Teaching Computers Computers

Does this sound a little redundant? Or maybe a little redundant? Well, just sit back and let me explain. The Artificial Intelligence Applications Institute has many projects that it is working on to make its computers learn how to operate themselves with less human input. To get more functionality with less input is the aim of AI technology. I will discuss just two of these projects: AUSDA and EGRESS.

AUSDA is a program which will examine software to see if it is capable of handling the tasks you need performed. If it isn't able, or isn't reliable, AUSDA will instruct you on finding alternative software which would better suit your needs. According to AIAI, the software will try to provide solutions to problems like "identifying the root causes of incidents in which the use of computer software is involved, studying different software development approaches, and identifying aspects of these which are relevant to those root causes producing guidelines for using and improving the development approaches studied, and providing support in the integration of these approaches, so that they can be better used for the development and maintenance of safety critical software."

Sure, for computer buffs this program is definitely good news. But what about the average person who thinks the mouse is just the computer's foot pedal? Where do they fit into computer technology? Well, don't worry, because us nerds are looking out for you too! Ask AIAI what they have for you, and it turns out EGRESS is right up your alley. This is a program which studies human reactions to accidents. It is trying to model how people's reactions in moments of panic save lives. Although it seems like humans would fall apart in tough situations and have no idea what to do, the opposite is in fact true: quick decisions are usually made and are effective, though not flawless. These computer models will help rescuers make smart decisions in times of need. AI can't be right all the time, but it can suggest actions which we can act on and which therefore lead to safe rescues.

So AIAI is teaching computers to be better computers and better people. AI technology will never replace man, but it can be an extension of our bodies which allows us to make more rational decisions faster. And with institutes like AIAI, we continue each day to step forward into progress.
No worms in these Apples

by Adam Dyess

Apple computers may never have been considered the state of the art in Artificial Intelligence, but a second look should be given. Not only are today's PCs becoming more powerful, but AI's influence is showing up in them. From macros to voice recognition technology, PCs are becoming our talking buddies. Who else would go surfing with you on short notice, even if it is only the Net? Who else would care to tell you that you have a business appointment scheduled at 8:35 and 28 seconds, and would notify you about it every minute till you told it to shut up? Even with all the abuse we give today's PCs, they still plug away to make us happy. We use PCs more not because they do more or are faster, but because they are getting so much easier to use. And their ease of use comes from their use of AI.

All Power Macintoshes come with speech recognition. That's right: you tell the computer to do what you want without it having to learn your voice. This application of AI in personal computers is still very crude, but it does work, given the correct conditions and a clear voice -- not to mention the requirement of at least 16MB of RAM for quick use. Also, Apple's Newton and other hand-held note pads have script recognition: cursive or print handwriting can be recognized by these notepad-sized devices. With the pen that accompanies your silicon note pad, you can write a little note to yourself which magically changes into computer text if desired. No more complaining about sloppily written reports if your computer can read your handwriting. If it can't read it, though, perhaps in the future you can correct it by dictating your letters instead.

Macros provide huge stress relief, as your computer does quickly what you could do only tediously. Macros are old, but they are, to an extent, intelligent: you have taught the computer to do something by doing it only once. In businesses, applications are often upgraded, and the files must be converted; all of the business's records must be changed into the new software's format. Macros save a human the work of converting hundreds of files by teaching the computer to mimic the actions of the user, giving the computer a task that it can repeat whenever ordered to do so.
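To see the record-and-replay idea behind macros in one place, here is a minimal Python sketch; the class, the conversion step, and the file names are all hypothetical, invented purely for illustration:

# A minimal sketch of macro record-and-replay: the recorder captures a
# step the first time it is performed, then repeats the whole recorded
# sequence on any new target. All names here are illustrative.

class MacroRecorder:
    def __init__(self):
        self.steps = []  # recorded (action, extra_args) pairs

    def record(self, action, target, *args):
        """Perform an action on a target once, and remember the step."""
        self.steps.append((action, args))
        action(target, *args)

    def replay(self, target):
        """Repeat every recorded step on a new target."""
        for action, args in self.steps:
            action(target, *args)

def convert(record, fmt):
    # Stand-in for a real file-conversion step.
    print(f"converting {record} to {fmt}")

recorder = MacroRecorder()
recorder.record(convert, "record_001.dat", "new-format")  # taught once

for name in ["record_002.dat", "record_003.dat"]:  # repeated on demand
    recorder.replay(name)

Taught the conversion once on the first file, the recorder repeats it verbatim on every remaining file, which is exactly the stress relief described above.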

AI is all around us, but get ready for a change. And don't think the change will be hard on us, because AI has been developed to make our lives easier.
The Scope of Expert Systems
As stated in the 'approaches' section, an expert system is able to do the work of a professional. Moreover, a computer system can be trained quickly, has virtually no operating cost, never forgets what it learns, and never calls in sick, retires, or goes on vacation. Beyond that, intelligent computers can consider amounts of information that might never be considered by humans.
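To make the idea concrete, here is a minimal sketch of how a rule-based expert system pools its knowledge: facts go in, and rules keep firing until nothing new can be concluded (forward chaining). The rules and facts are invented for illustration, loosely echoing the weather-forecasting example discussed below:

# A minimal forward-chaining rule engine: facts are strings, and each
# rule maps a set of premises to a conclusion. The engine keeps
# applying rules until no new fact can be derived. Rules are invented.

RULES = [
    ({"humidity high", "pressure falling"}, "rain likely"),
    ({"rain likely", "temperature below freezing"}, "snow likely"),
    ({"rain likely"}, "advise carrying umbrella"),
]

def infer(facts):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in RULES:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(infer({"humidity high", "pressure falling"}))
# -> adds "rain likely" and "advise carrying umbrella" to the fact base

Unlike a human forecaster, an engine like this never overlooks a rule it has been given -- though it also has none of the hunches discussed next.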

But to what extent should these systems replace human experts? Or should they at all? For example, some people once considered an intelligent computer a possible substitute for human control over nuclear weapons, citing that a computer could respond more quickly to a threat. And many AI developers were wary of programs like ELIZA, the computer psychiatrist, and the bond that humans were forming with the computer. We cannot, however, overlook the benefits of having a computer expert. Forecasting the weather, for example, relies on many variables, and a computer expert can more accurately pool all of its knowledge. Still, a computer cannot rely on the hunches of a human expert, which are sometimes necessary in predicting an outcome.

In conclusion, in some fields, such as forecasting weather or finding bugs in computer software, expert systems are sometimes more accurate than humans. In other fields, such as medicine, computers aiding doctors will be beneficial, but the human doctor should not be replaced. Expert systems have the power and range to aid, to benefit, and in some cases to replace humans; and computer experts, if used with discretion, will benefit humankind.

More essays to come. If you have a suggestion or a possible contribution, please comment.

Wednesday, December 1, 2010

Chanakya, a servant leader


The Brahmin lecturer at Takshashila not only uprooted the Nanda dynasty and enthroned the candidate of his choice -- the deserving dasi-putra Chandragupta Maurya, a politics and economics graduate of Takshashila and his favourite student -- but also took full control of the political situation of the time to harmonise and strengthen the nation. Chanakya is truly a marvel and a great thinker. He was a political mastermind, and thus his Niti Shastra has been acclaimed for centuries. He was also well versed in commerce, revenue generation, administrative policy, state affairs, the military needs of a nation, nation-building, and more, which in combination gave birth to a widely acclaimed and still relevant treatise on economics: the Arthashastra (also called the Chanakya Sutra).

Chanakya & the Magadh Dynasty
Chanakya has been given the epithet Kutil-Matih Kautilya, which means a man of twisted (subtle) intellect; thus he was named Kautilya by the think tanks of that era. Since he was the son and student of Acharya Chanak, he was also addressed as Chanakya. Otherwise, he was Vishnugupta to his parents, friends, and the Gurukula community, including in all the official documents of the time.
Beyond doubt, he was an arch-diplomat. As the Adhishthhata/Dean and Acharya/Professor of the faculty of Political and Economic Thought of the Brihaspati and Shukra Niti at Takshashila, he was himself accomplished in history, criminology, law and legal institutions, along with theoretical and applied economics. He was the chief architect of the foundation of the Maurya administration. He is known to the western world, to Indologists, and to students of comparative political thought as either Chanakya or Kautilya, because of his great contribution: a treatise on political economy, the Artha Shastra.
He was an economic strategist and applied those strategies to welfare schemes, the enhancement of revenue, and other programmes during Chandragupta's rule.
Magadha, having a strategically strong base and being centrally located, served Chanakya's mission to see India free from any future foreign invasion and to inculcate the spirit of Indian nationalism in the kings of the smaller states under Chandragupta Maurya.
Chanakya & the idea of one nation
Chanakya was more interested in the cultural and political integration on the strong foundation of the socio-religious environment.
That was why he believed that Shastra (military strength) and Shastra (scriptural attainment) should be combined for the welfare of the India of that time. As an educationist, he felt the necessity of interdisciplinary courses and comparative studies in the higher curriculum. He thereby made both hands of the ruler stronger.
The first component is the physical strength to combat any eventuality, and the other is the intellectual analysis needed to live in peace while attaining progress and prosperity. A nation can forge ahead when internal peace prevails, and can meet the challenges of the time when its policy planners are properly trained in their respective fields. This leads to specialization in the administrative network, under capable but honest hands. Chanakya succeeded in implementing a secret agency, led faithfully by Bhagurayana and Siddharthaka. All the secret services were subject to checks and cross-checks. Both intelligence chiefs conducted their missions of bringing about political integration under the banner of Chandragupta Maurya with the feeling that "Bharata is one".
Chanakya - The Nation Builder
Acharya Chanakya, a great socio-political scientist at the university of Takshashila, was greatly upset by the invasion of Alexander, but did not lose heart. He held conferences and seminars on campus on the burning topic: Save Bharata from Invasion.
Because of his inborn genius and selfless dedication, he gained the support of the Kulapati/Chancellor of the university to hold such meetings, private and public, to mobilize human resources to see Alexander out of the territories of Bharata, and to create a central government to make Bharata a militarily strong nation, able to defend the borders of the Rashtra and preserve internal peace whenever the need arose. He succeeded in this mission through Chandragupta Maurya, who had also studied military science and was appointed Senapati Kartikeya/Commander of the liberation forces, with headquarters at the university of Takshashila, and through his own political wisdom, diplomatic policies, and in-depth research in the socio-political thought of India. In spite of his best efforts, Alexander took hold of the frontier territories, and his Kshatraps/military commanders held the land.
After Alexander died on his return journey, Chanakya had Carnivallia, the daughter of Seleukos, married to Chandragupta Maurya to make him much stronger in Indo-Greek ties, both politically and diplomatically. This marriage took place under the treaty of "jus connubii", the rights earned by marriage. It was through the diplomatic vision of Chanakya that Seleukos defeated his sworn enemy Antigonus; in return, Chandragupta was given the territories of Kabul, Gandhar, and Herat. Chandragupta then annexed Punjab and Saurashtra, and was also given the port of Patal on the seashore. The foremost aim of Chanakya was to see a strong Bharata flourish after the invasion of Alexander.
He succeeded in his dream, which he turned into reality.
The Arthashastra
The Artha Shastra is divided into 15 books, known as Adhikaranas, comprising 180 sections, known as Prakaranas, spread over 150 Adhyayas/chapters.

Book 1 deals with the discipline and training of the ruler/king, the criminal penal code and its enforcement, the eligibility of ministers, the administrative system, and the intelligence network.

Book 2 deals with the organizational system based on the bureaucratic set-up, with duties and responsibilities, and with the procedures and norms for the collection of taxes, trade, and commerce.

Book 3 deals with civil laws and administration.

Book 4 deals with the suppression of anti-social activities.

Book 5 deals with laws against sedition and treason, and with scales of pay and the fixation of expenses for the royal entourage, including government officials.

Book 6 deals with the essential features of the state, consisting of the seven-fold system.

Book 7 deals with inter-state political and diplomatic relations.

Book 8 deals with measures against natural calamities and dangers at times of calamity.

Book 9 deals with military campaigns and their refresher courses.

Book 10 deals with ancillary problems of defense.

Book 11 deals with economic planning and political institutions.

Book 12 deals with research and analysis of the secret services.

Book 13 deals with the fresh set-up measures to be taken in a conquered country.

Book 14 deals with secret designs for the destruction of enemies.

Book 15 is a glossary of the technical terms used in the Artha Shastra.
The Duties of a King: The Arthasastra
Kautilya, Prime Minister to Emperor Chandragupta Maurya (4th century BCE)
How Chanakya succeeded in becoming the greatest, and truly the one and only, servant leader for generations.
What follows identifies Chanakya as a true leader and explains why there is a need to usher in a servant leader, not a feudal one, into the conniving nexus of politicians and the executive body of India.
The functions of a servant-leader

“True leadership emerges from those whose primary motivation is a deep desire to help others.”

Many leaders of our country have advocated the injection of servant leadership into India's systemic governance. In spite of the extensive damage that colonialism has done to its capacity for holistic development, the country can re-invent itself through effective leadership shorn of graft.

Crying over spilt milk serves no practical purpose. Elegiac anti-Western rhetoric will not build the country good roads, or feed and educate its rural poor. What should be done is to stop buck-passing, shake off fatalism and negative rationalism, and face the challenges of India's rebirth with servant leadership.

As a leadership approach, servant leadership will aid the honest harnessing of the country's vast resources to forge sustainable wealth and welfare. Also, it will ensure that leaders rule with Messianic motive, unfettered by neo-colonial intrusion. Further, servant leadership will rid the nation of dictatorship, as it runs on consensus instead of coercion, preferring collectivism to individualism.

The concept
As a concept, servant leadership was coined and defined by Robert K. Greenleaf (1904-1990) in his essay titled The Servant as Leader, published in 1970. Greenleaf had spent 40 years in the field of management research, development and education at AT&T. On retirement, he took up another career teaching and consulting at a number of major academic institutions and corporations, including the Harvard Business School, MIT, Mead Corporation and the Ford Foundation.
Greenleaf's idea of servant leadership might have been inspired by his experiences during his half-century of work at the helm of large institutions. But what helped crystallise his thinking on the concept was the lesson he learnt reading Hermann Hesse's novel Journey to the East, which tells the story of a people on a journey for a spiritual purpose. Writing in the journal Leader to Leader (No. 34, Fall 2004), Larry Spears, CEO of the Robert K. Greenleaf Center for Servant Leadership, said, "Greenleaf concluded that the great leader is first experienced as a servant to others, and that this simple fact is central to the leader's greatness." Spears continued: "True leadership emerges from those whose primary motivation is a deep desire to help others."

The servant-leader concept offers an alternative approach to leadership practice -- leading others by serving their mutual interest. The emphasis, in Spears's words, is "increased service to others, a holistic approach to work, promoting a sense of community, and the sharing of power in decision making". In recent years, servant-leadership as a leadership development concept has spawned scholarly writings by Stephen Covey, Peter Block, Peter Senge, Max DePree, Margaret Wheatley, Ken Blanchard, John Sullivan, and others. Greenleaf's works, together with those of these leadership consultants, have helped sustain the concept's currency in leadership practice and win it many disciples in the education and corporate worlds.

A servant-leader, according to Greenleaf, is “one who is a servant first”. He wrote: “It begins with the natural feeling that one wants to serve. Then conscious choice brings one to aspire to lead. The difference manifests itself in the care taken by the servant – first to make sure that other people’s highest priority needs are being served.” Greenleaf provides the test questions for servant-leadership. He asked: “Do those served grow as persons, do they, while being served, become healthier, wiser, freer, more autonomous, more likely themselves to be servants…?”

Greenleaf may be responsible for the coinage and definition of servant-leadership, but the idea wasn't entirely his own. According to Wikipedia, "Chanakya or Kautilya, the famous strategic thinker from ancient India, wrote about servant leadership in his 4th century book, Arthashastra." Also, the Lord Jesus taught the concept when He said to His disciples, "Ye know that they which are accounted to rule over the Gentiles exercise lordship over them; and their great ones exercise authority upon them. But so shall it not be among you: but whosoever will be great among you, shall be your minister; and whosoever of you will be the chiefest, shall be servant of all" (Mark 10:42-44).

Wikipedia (2001) notes that servant-leadership, an "upside down" leadership style, "puts the needs of followers above the needs of the leader, promotes teamwork, individual dignity and worth, and results in a synergy of purpose unachievable with the old leadership models". As an organisational management style, servant leadership works by "eschewing the common top-down hierarchical style and instead emphasising collaboration, trust, empathy and the ethical use of power … The objective is to enhance the growth of individuals in the organisation and increase teamwork and personal involvement".

The characteristics
Larry Spears has extracted the characteristics of servant leaders from Greenleaf's original writings. They are 11 in number. Some of these characteristics are inborn, while others can be developed through education and practice.
For example, characteristics such as calling, empathy, healing and stewardship are more inherent personal traits than learned skills. A leader must test positive for these traits to be successful at servant-leadership practice. The remaining characteristics -- listening, awareness, persuasion, conceptualisation, foresight, growth and building community -- can be learned and continually developed by servant-leaders.

Now you can see that servant leadership is more than a wish: you must be both born and made for it. Writing in NebGuide (2002), leadership development specialists John E. Garbuto and Daniel W. Wheeler observed that servant leaders have a natural, deeply rooted, value-based calling to serve others.

“A servant leader,” they asserted, “is willing to sacrifice self-interests for the sake of others”. They warned: “This characteristic cannot be taught, so unless a person has a natural calling to serve, servant leadership is not realistic or compatible style.” They counselled people to “reflect and thoughtfully assess the degree to which (they) have what it takes to be a servant leader”.

This counsel, however, should be directed at the nation's electoral institutions and its hasty voters. What about testing election candidates for servant-leader traits before they are cleared to contest? What about educating voters about the principles and profits of servant leadership, persuading them to vote for candidates with servant-leader potential?

India's political leadership will profit much from servant leadership. Check the records, and you will see that the leaders who turn their nations right side up are those who operate on principles of leadership servanthood. Such leaders aren't motivated to seek political office for increased power; they are out to fill the void in their people's collective existence. No wonder they are able to feed their nations. Indian organisations, too, should include servant-leader traits in their criteria for management positions and design appropriate means to measure them.

The critics
The philosophy of servant-leadership has been challenged by some leadership scholars who have objected to the implications of its metaphoric paradox. One critic was apparently disturbed by the connotations of the term “servant” in the compound word, “servant-leadership”. He argued that, “serving people’s needs creates the image of being slavish or subservient… The principles of servant leadership are admirable. It is the image of servant with a slave-like connotation that is problematic.”

But it need not be so. Greenleaf didn't call on leaders to carry their subordinates' briefcases or stand up when a line officer enters the office. Servant-leadership doesn't mean nobody is in charge. It doesn't seek to blur the distinction between the leader and the led. But it does recommend that the leader conduct business with the people's all-round welfare in mind.
In fact, one of the critics concedes that servant leadership “forces us away from self-serving, domineering leadership and makes those in charge think harder about how to respect, value and motivate people reporting to them.” According to Larry Spears, “At its core, servant leadership is a long-term, transformational approach to life and work … that has the potential for creating positive change throughout our society”.

This leadership approach need not conflict with organisational goals, as some critics fear. In fact, most industrial crises can be traced to worker dissatisfaction, triggered when leaders, in their haste to achieve a dream bottom line, adopt policies that set workers' teeth on edge. Servant leadership can help prevent this. All that is needed to ensure that the approach works well is to harmonise organisational goals with workers' real and felt needs. I will now outline the characteristics of the servant leader.

* Listening
Communication skills are imperative to effective leadership. But the servant-leader isn't only a good speaker; he is also a good listener. For he needs to hear the people's views and feel their pulse to determine the direction the nation or organisation should take to realise its goals. Many an Indian political leader is reputed to be a bad listener; some are known to have hardened themselves against the expert advice of state officials and public opinion leaders, nor do they hear their people's cries for attention. This is one reason why many leaders fail. For, whether in a nation or an organisation, only leaders who listen intently to the people they are leading have a better shot at satisfying both corporate and individual needs.

* Empathy
Another chief reason why many Indian leaders fail to improve the state of their nations and people is their lack of empathy: they live outside the people's world. "The servant-leader," wrote Spears, "strives to understand and empathise with others." The leader may not pander to the people's every whim; nonetheless, he views their intentions as good and values their feelings. But this won't happen unless the leader chooses to wear the people's shoes. A servant leader can't live in a cocoon.
The servant leader must be like the silkworm, which spins out its day's work without bothering about the result or the beneficiaries.

Read this post for more: http://cj.my/politickler/2010/02/25/have-you-heard-about-servant-leadership/