Saturday, July 17, 2010

Enhancing Brains

What Are We Afraid Of?

Editor's note: In 2008, Henry T. Greely, a professor at Stanford Law School, co-authored a commentary in Nature; it concluded that "safe and effective cognitive enhancers will benefit both the individual and society." The article inspired an impressive number of responses from readers, and the debate has continued in scholarly journals and the mainstream media in the years following publication. Here Professor Greely builds on that momentum, arguing that only some concerns about cognitive enhancement are justified and that those concerns deserve careful attention. He contends that rather than banning cognitive enhancements, as some have suggested, we should determine rules for their use.

In December 2008, I was the first author on a paper in Nature called "Towards Responsible Use of Cognitive-Enhancing Drugs by the Healthy."1 We argued that there was nothing inherently wrong with the use of drugs for cognitive enhancement, although issues of safety, fairness, and coercion will require attention. I received far more communications about that article than about anything else I have ever written. About one-third of them were thoughtful responses, some in favor of cognitive enhancement and some opposed. Another third said, roughly, "How much crack were you smoking when you wrote that?" The last third said, also roughly, "How much money did large drug companies pay you to write that?" (I kept waiting for "How much crack did large drug companies give you to write that?" but, alas, that question never came.) In spite of what some of my correspondents seemed to think, the article had not called for putting stimulants into the water supply. We thought we were taking an open-minded but cautious approach to the issue. So, what prompted this strong response and what, if anything, can we learn from it?

Probing that question is my ultimate aim in this article, but we will get there somewhat indirectly. I will first make an affirmative argument for cognitive enhancement through drugs or other neuroscientific interventions. Then I will talk about concerns, both appropriate and inappropriate, about these kinds of enhancements. Only then will I try to understand the strong negative reactions to our paper and what we might learn from them.

Note first, though, that this is largely a hypothetical discussion, as cognitive enhancement remains largely in the future. Current, direct brain interventions for the purposes of cognitive enhancement are few and far between, comprising mainly a few stimulant drugs of unclear (but clearly not enormous) effectiveness and uncertain risks. Yet the explosion of our knowledge of the human brain—and the scores of billions of dollars being invested in discovering treatments for brain-based ailments, some of which are likely to spin off enhancement products—convince me that these issues will be substantial, perhaps in this decade and likely in the next.

Why Cognitive Enhancement — And What Kinds Of Enhancements?

We want to enhance our brain for the same reason we want to enhance anything else: to make it work better. At the risk of tautology, enhancing something means making it better. The current controversy is about what I call direct biological cognitive enhancements—chemical, physical, or electromagnetic intrusions into our physical brains. Drugs certainly qualify, but so do brain surgery, direct brain stimulation, microelectrode insertions, transcranial magnetic stimulation, and a variety of other possible interventions made newly possible, or newly plausible, by advances in neuroscience. All were developed for therapeutic purposes, but many have potential uses for enhancement.

Of course, we have been engaged in less-direct cognitive enhancement for a long, long time. Language may have been our first crucial cognitive enhancement as a species. A few millennia ago, we added writing, an enhancement that dramatically improved our cognitive abilities to remember, to learn, and to communicate. The list stretches to the Internet and beyond.

Some might object that these are tools, not enhancements. But our tools are enhancements. Like everything else we do, tool use changes our brains. Literacy changes how our brains function and the physical layout of our synapses and circuits; so, no doubt, does Google. Although it may be useful to distinguish between "tool enhancements" and "direct brain enhancements", we always need to ask why—and whether—it matters if we improve our brains through a keyboard or by drugs, deep brain stimulation, or neurosurgery.2

What Should We Worry About?

Many people find direct brain enhancement frightening. There are some good reasons for concern, though not, I think, for fear; other worries about these technologies are unsubstantiated. The three issues I worry about are safety, fairness, and coercion.

We have at least some vague idea of the risks of existing enhancement technologies, not from large, systematic studies, which don't exist, but because we have seen them in widespread use. Furthermore, tool enhancements seem less likely than direct enhancements to have dangerous effects on the brain. An enhancement that works through the visual system, for example, is unlikely to pose substantial new kinds of safety risks (although there may be exceptions, such as flashing patterns that might trigger epileptic seizures). Given the awesome complexity of the human brain and our still very limited understanding of it, we should be worried about the effects of new drugs or new methods of brain stimulation or surgery designed to enhance the brain.

The way such enhancements are likely to be regulated should magnify our concern. I expect most of these enhancing technologies to be developed not for the purpose of enhancement but as treatments for illness or deterioration. In the United States, drugs, biological products, and most high-risk medical devices can be sold only after the Food and Drug Administration (FDA) has been convinced that they are both safe and effective. The FDA, however, makes that decision in the context of one particular proposed use. This makes perfect sense—a drug that instantly cures half of the people treated and quickly but painlessly kills the other half would be seen as acceptably safe and effective as a treatment for metastatic pancreatic cancer, but not for treating acne.

Once the FDA approves a new drug, biologic, or medical device, a physician may legally prescribe it for any use, even though that use may never have been proven safe or effective. This approach to "off-label use" will mean that, at least as far as the FDA is concerned, approved drugs, biologics, and devices can be used for enhancement purposes where neither the safety nor the efficacy is known and where the balance of benefit and risk may be quite different. (Professional standards and potential malpractice liability do provide some constraints.)

Even worse, some non-drug methods of cognitive enhancement will receive no FDA review. The FDA has no jurisdiction over new surgical techniques using approved drugs and devices; no one has to prove in advance that these techniques are safe and effective. Medical devices that the FDA does not consider high-risk, or that are "substantially similar" to existing devices, get only minimal review. Dietary supplements are almost completely unreviewed when they are purported only to affect the "structure or function" of the body—as would be the case with cognitive enhancements—and not to treat disease.

Off-label use is already a problem. Amphetamine and dextroamphetamine (Adderall) and methylphenidate (Ritalin) are drugs prescribed to millions of Americans to treat attention deficit disorder (ADD) or attention-deficit/hyperactivity disorder (ADHD). They are also widely used on college and high school campuses as "study drugs" to help students without an ADD or ADHD diagnosis fight off sleep and focus better on their work. A survey in 2008 showed that four percent of 1,800 randomly surveyed students at a large public university had prescriptions for Adderall or Ritalin; another 34 percent of the students (including more than half of the juniors and seniors) had used the drugs without a prescription, and almost all of them said they took the drugs to help them study.3 Yet we have few, if any, good studies of the safety and efficacy of these drugs when students use them without a prescription to try to improve their educations (or at least their grades).

Future cognitive enhancements will only make this problem worse. Like any interventions, they will have varying risks and benefits, yet our current regulatory scheme would require no testing for some kinds and very little testing for others. Even for drugs, the most regulated kind of enhancement, testing would cover only therapeutic uses in people who are ill, not their enhancing uses in people who are healthy.

How can we handle this problem? We could put our hopes in malpractice liability to limit prescriptions (though that will not prevent patients from reselling the drugs or giving them to their friends). We could try, with scant chances of success, to ban the use of all cognitive enhancements because of safety concerns. Or we could require better research on, and better regulation of, cognitive enhancements.

Fairness concerns also demand attention. New, effective cognitive enhancements could add to existing questions. Is it fair for one student to take an exam after studying all night using Adderall when another student has not taken the drug? Is it fair for one student to take the exam after nighttime studying fueled by double espressos or energy drinks? Is it fair for one student to take the test after receiving tutoring that another could not afford, or after using a computer or a computer program that the other did not have?

There is a broader question of fairness here. If direct cognitive enhancements really work, and if they are expensive, presumably only the rich will have access to them. This is the new face of an old concern. The single greatest current cognitive enhancing technology is primary education, particularly literacy and arithmetic. Not long ago, even in rich countries, primary education was generally unavailable to the children of the poor. Now almost every country makes free primary (and usually secondary, and sometimes tertiary) education available.

Cognitive enhancements need not lead to unfairness. If limited access to effective cognitive enhancement is a problem, we probably could manage it much better by making enhancements available to everyone than by trying, probably unsuccessfully, to prohibit them to anyone. If we felt that this remained unfair to those who chose not to use enhancements, teachers could institute different grading curves for "enhanced" and "unenhanced" students. And note that if students do learn better using direct brain enhancements with low risk, both they and the world are presumably better off because they are better educated.

Finally, we must consider coercion. Should we allow people to be forced to undergo direct cognitive enhancement? The example of free education suggests one answer: Sometimes we should. Countries make primary education not only universally accessible but universally compulsory. Some enhancements might be so safe and so powerful that, like reading, writing, and arithmetic, they should be required.

Short of that, though, should we put limitations on coercion? Employers often force employees to attend workshops or take courses to improve the employees' performance, without any real evidence that such interventions are either safe or effective. Should an employer be allowed to say, "Take these memory-improving pills or you will be fired"? Should the military be able to say, "Take these alertness-improving pills or you will be court-martialed"?

Most difficult, should parents be able to coerce their children to use direct brain enhancements? We give parents very broad scope in decisions about raising their children; the Supreme Court has ruled that the Constitution protects parents' rights to make some such decisions. Yet that discretion is not endless; at some point, child protective services can be called and parental discretion overruled. Where should we draw the line with parents seeking to improve their children's brains?

These questions about coercion, like those regarding safety and fairness, do not lend themselves to definitive answers. Good answers will doubtless depend on the enhancement technology and on the social circumstances. But we need to begin to come up with answers, and soon.

What Shouldn't We Worry About?

There are at least three unsound reasons for concern: cheating, solidarity, and naturalness.

Many people find the assertion that enhancement is cheating to be convincing. Sometimes it is: If rules or laws ban an enhancement, then using it is cheating. But that does not help in situations where there are no rules or the rules are still being determined. The problem with viewing enhancements as cheating is that enhancements, broadly defined, are ubiquitous. If taking a cognitive-enhancement drug before a college entrance exam is cheating, what about taking a prep course? Using a computer program for test preparation? Reading a book about taking the test? Drinking a cup of coffee the morning of the test? Getting a good night's sleep before the test? To say that direct brain enhancement is inherently cheating is to require a standard of what the "right" competition is. What would be the generally accepted standard in our complex and only somewhat meritocratic society?

The idea of enhancement as cheating is also related to the idea that enhancement replaces effort. Yet the plausible cognitive enhancements would not eliminate the need to study; they would just make studying more effective. In any event, we do not reward effort, we reward success. People with naturally good memories have advantages over others in organic chemistry exams, but they did not work for that good memory.

Some argue that enhancement is unnatural and threatens to take us beyond our humanity. This argument, too, suffers from a major problem. All of our civilization is unnatural. A speaker could not fly across a continent, take a taxi to an air-conditioned auditorium, and give a microphone-assisted PowerPoint presentation decrying enhancement as unnatural without either a sense of humor or a good argument for why these enhancements are different. Because they change our physical bodies? So do medicine, good food, clothing, and a hundred other unnatural changes. Because they change our brains? So does education. What argument justifies drawing the line here and not there? A strong naturalness argument against direct brain enhancements, in particular, has not been—and I think cannot be—made. Humans have constantly been changing our world and ourselves, sometimes for better and sometimes for worse. A golden age of unenhanced naturalness is a myth, not an argument.

Why Do People Care So Much?

So why did I get all those odd e-mails? Why do people care so much about this? I do not have any good social-science data, but I want to suggest some ideas that seem plausible.

First, the cognitive enhancements we have today are stimulants. Some stimulants are drugs of abuse, illegally used to get high, and these can be very risky. Adderall is a combination of several kinds of amphetamines; other enhancing stimulants similarly provide alertness and attention. The currently available direct brain enhancements are not only weak and, when used off-label, of uncertain safety, but they are a socially stigmatized kind of product. I suspect that people will be more accepting when the cognitive enhancer is a drug that boosts memory, or a little electrical stimulator that sticks to the scalp.

Second, some people compare cognitive enhancement to steroids, human growth hormone, and "blood doping" in sports. Just as the latter are wrong, they urge, so is the former. It is not at all clear to me that enhancement in sports is wrong, except to the extent that it is against the rules of particular sports and breaking the rules is wrong. Top-level athletes are enhanced by training, conditioning, equipment, nutrition, coaching, and psychological counseling, among other things. The case for singling out performance-enhancing drugs as the bad enhancement in sports has the same problems as the case against cognitive enhancement.

But I do want to suggest two ways sports are different. First, the competition is more direct. One wrestler will win the gold medal, the other the silver. Perhaps in that context fairness requires more evenness—though, of course, that does not tell us whether the competitors should be equally pure or equally enhanced. Second, although I love them dearly, sports are entertainment. The world is not better or worse depending on who wins an Olympic gold medal, the Super Bowl, or the World Cup. The world may well be better off if more brains are enhanced, if more people are learning and thinking more effectively. Granted, it is an unprovable assumption—perhaps just a bias—that a world with smarter people would be a better world, but in some areas, such as biomedical research, the assumption seems fair. Better medical treatments, developed and delivered sooner, relieve human suffering and improve the world. A new weight-lifting record does not.

Third, some people feel threatened by cognitive enhancements. They fear that, in order to compete in their world, they will need to add enhancing drugs to 24/7 smart-phone access, 80-hour workweeks, and disappearing vacations. This is not an irrational fear, but it is probably not widespread. Most Americans do not work 80-hour weeks, constantly searching for the next competitive advantage. Most of the country has found comfortable, or at least acceptable, compromises between work and the rest of life. For such people, cognitive enhancement would not be required in order to run faster just to stay in place. It might instead be a way to do the same, fixed amount of work in less time, providing more free time, not less. For driven people, the problem is not in their tools but in themselves. Safe and effective cognitive enhancement might add yet one more pathway to unhappiness, but it will not cause unhappiness.

Fourth, the kinds of enhancements we are talking about are, for the most part, new. One of the exceptions is revealing. Few people worry much about using caffeine, yet it is a cognitive-enhancement drug that comes with risks. In a large enough dose, it can even be fatal. (The fatal dose would require drinking about a hundred cups of coffee in a short time period, though deaths have been reported from an overdose of caffeine pills.) We are more comfortable with longstanding enhancements than with newer ones. Only part of that comfort is that we are more aware of the benefits and the risks of the older enhancements and the ways they are used. Even if a new, direct brain enhancement were demonstrated beyond a doubt to be safe and effective, its newness would evoke discomfort.

Finally, some people worry about where this will all end, projecting a path from Adderall to a human/computer cyborg. We cannot, of course, confidently predict where these technologies will lead. We can see blurred visions of the next decade or two; we can see almost nothing at all about the next century, let alone the next millennium. Mankind may change dramatically and cognitive enhancements may turn out to play a crucial role in that transformation. Or maybe not. I would suggest only that if we should develop safe and effective direct brain enhancements, we should not reject them for fear of where they may lead in a distant future. Future applications will be the problems, and the decisions, of our grandchildren and their grandchildren, who will have the benefit of more knowledge both of the technologies involved and of their culture's views of those technologies. For us to think that we can, today, make better choices for them based on almost no information about the questions they will face is hubris.

In Conclusion – It Depends

As in the Nature article, I have not argued here that direct brain enhancements are good, let alone that they should be added to the water supply. I have argued that they are not necessarily bad. Their appropriate use will depend on their safety and effectiveness, along with how we choose to use them and what steps we take to mitigate the challenges to fairness they may pose or the invasions of individual autonomy they may provoke.

Biomedicine will be creating more and more products and procedures that can be used for cognitive enhancement. Some of them will be used in ways that will, on balance, improve human life and society. At the same time, I worry that they may be used in harmful ways. I am confident, though, that a knee-jerk rejection of all direct brain enhancements will be at least a missed opportunity and at worst an opening for a damaging underground and uncontrolled world of enhancements. In order to maximize the benefits and minimize the harms of these new technologies, we will need to look at particular enhancements rationally and to adopt, ban, or regulate them carefully. On this, much depends.

By Henry T. Greely, J.D. / July 14, 2010

References

1. H. T. Greely, B. Sahakian, J. Harris, R. Kessler, M. Gazzaniga, P. Campbell, and M. Farah, "Towards Responsible Use of Cognitive-Enhancing Drugs by the Healthy," Nature 456 (2008): 702–705.

2. I have written more about tools as enhancements in two places: H. T. Greely, "Remarks on Human Biological Enhancement," University of Kansas Law Review 56 (2008): 1139–1157; and H. T. Greely, "Regulating Human Biological Enhancements: Questionable Justifications and International Complications," The Mind, The Body and The Law: University of Technology Sydney Law Review 7 (2005): 87–110 / Santa Clara Journal of International Law 4 (2006): 87–110 (joint issue).

3. A. D. DeSantis, E. M. Webb, and S. M. Naor, "Illicit Use of Prescription ADHD Medications on a College Campus: A Multimethodological Approach," Journal of American College Health 57 (2008): 315–323.

About Henry T. Greely, J.D.

Henry T. Greely, J.D., is the Deane F. and Kate Edelman Johnson Professor of Law and professor (by courtesy) of genetics at Stanford University. He specializes in ethical, legal, and social issues arising from advances in the biosciences, including neuroscience, genetics, and human stem cell research. He chairs the California Advisory Committee on Human Stem Cell Research and the steering committee of the Stanford University Center for Biomedical Ethics, and he directs the Stanford Center for Law and the Biosciences. From 2007 to 2010 he was a co-director of the Law and Neuroscience Project. In 2006, he was elected a fellow of the American Association for the Advancement of Science. 

Sunday, July 11, 2010

Five Practices To Encourage Employee Empowerment

New Bersin & Associates research shows a direct positive correlation between learning culture (the collective practices that encourage and enable open sharing of information and all forms of employee development) and business outcomes. The same research shows that most companies do not capitalize on this potential advantage.

Based on responses from 426 organizations, the study, "High-Impact Learning Culture," identifies 40 specific practices that have the greatest business impact, based on correlation to 10 business outcomes: customer input, customer responsiveness, customer satisfaction, innovation, employee productivity, workforce expertise, time to market, market share, cost structure, and learning agility.

The five practices that have the greatest business impact are:

  1. Leaders are open to hearing the truth, including bad news.
  2. Employees at all levels of the organization are encouraged to ask questions.
  3. Decision-making processes are clearly defined throughout the company.
  4. Employees frequently are given tasks or projects beyond their current knowledge or skill level to stretch them developmentally.
  5. Employees have influence over which job tasks are assigned to them.

Of the top 10 practices, companies struggle most with the last of these: giving employees influence over which job tasks are assigned to them. Only 23 percent list it as a strength.

Thanks to Inside Training / Lakewood Media Group LLC

Wednesday, July 7, 2010

Lock In Best Lockout Practices

A University of Vermont-hosted safety site has published what it calls the "Fatal 5"—the primary causes of LO/TO-preventable injuries. Make sure these hazards aren't present in your workplace.

OSHA requires you to train employees to prevent lockout/tagout (LO/TO) accidents in the workplace. Have you explained how to avoid the "Fatal 5" to your employees?

1. Failure To Stop Equipment. Sure, this sounds like common sense, but there's much more involved. Some workers value productivity above safety, and others feel that their age or experience with equipment makes them immune from risk. "Taking the trouble" to properly safeguard energized equipment is essential in all cases.

2. Failure To Disconnect From the Power Source. When working with and around electric equipment, some workers believe that simply operating the on/off switch will ensure their safety. They ignore the fact that the switch may be defective or that power may find its way through a short circuit or other source.

3. Failure To Drain Residual Energy. There's a reason that televisions carry warnings about trying to open the case even if the set is disconnected. That's because many electrical devices store power in a capacitor or battery. Even unplugged, the risk remains. A compressed spring, hot pipe, pressurized tank, or heavy object hanging overhead can store energy even when the initial source of power is disconnected.

4. Accidental Restart Of Machinery. Even if an employee knows how to shut down equipment before working on it, his or her co-workers may not. In too many instances, unknowing employees cause injury to their co-workers.

5. Failure To Clear Work Areas Before Restarting. Restarting machinery must be performed as carefully as shutting it down and locking it out. A repair tool left in the works can fly out, and a restart while a co-worker remains in the path of danger is as great a hazard as not locking out the machine at all.

Thanks to BLR Safety Daily Advisor

Tuesday, July 6, 2010

Hardwired Humans and...Talent Planning

For the last 20 years I've watched a crop of young people identified in the late 1980s as 'high potential leaders' in IBM Australia develop into top business leaders. One of the crop became a global executive with Microsoft, one is currently the CEO of IBM Australia and one is the CEO of Australia's largest telco. Back then we worked on a rule of thumb that it took around 20 years to grow a senior executive, so about 5% of young staff were identified and developed to provide the pipeline of leadership talent for the next generation. With such a good track record, were we gifted at spotting and developing talent? Or is there another explanation from the human instinct of classifying that more fully explains what unfolded and that we should incorporate into talent planning in organisations?
 
Rats In the Lab
Like all good lines of research, investigation into the impact of a leader's expectations on performance started with rats in a laboratory! In an early study back in 1964, one group of unsuspecting laboratory students was told that the rats they were studying were bred for maze-brightness (and thus this group had high expectations). A second group of lab students was led to believe that their rats were bred for maze-dullness (and hence had low expectations). The rats were in fact assigned randomly to the students. Well, in a sobering result for sophisticated talent planning, the rats in the high-expectation groups ran the maze faster than those in the low-expectation groups!
 
Kids In Schools
From the rat laboratory it was time to take the research to school classrooms. Students in 18 elementary (infants) classrooms were tested using a non-verbal IQ test. Twenty percent of the students in each class were then randomly labelled as 'intellectual bloomers', the workplace equivalent of 'high performers'. Teachers were told that students with this classification were expected to improve markedly in comparison to other students. Eight months later the tests were re-administered. Those students who were labelled as intellectual bloomers improved significantly more than the students who were not given this label. (Reference: N Kierein and M Gold, "Pygmalion in work organizations: a meta-analysis," in Journal of Organizational Behavior, 21, 913-928 (2000)).

In early high school I had a personal experience of the implications of a positive expectation from a school teacher. At the time I was neither enjoying nor doing well in history. In assessing a piece of homework, my teacher at the time, Mr Fisher, said to me, "Andrew, you're good at history!" Well, from that moment on I both enjoyed and did well at history! Thanks, Mr Fisher.
 
Adults At Work
The next step on the path of studying the impact of expectations on performance was to study adults in the real world. The setting was an Israeli Defence Force training camp. The course was an intensive one involving an average of 16 hours of instructor-trainee contact daily for 15 weeks.

Before the 115 soldiers entered the camp they were tested by the researchers on a range of capabilities. The soldiers were then randomly assigned to three categories of 'command potential' – 'high', 'medium' and 'unknown'. The researchers created this third, 'unknown' category to add credibility to the process in the minds of the instructors and to create the impression that there was not yet enough information about these trainees.

Four days before the trainees arrived at the training camp the instructors (leaders) were provided (mis)information about the trainees. The leaders were advised of the trainees' command potential, hence creating a performance expectation in the minds of the instructors. The instructors didn't know that the classification was entirely random. Would the expectation become self-fulfilling?

The potential impact of the expectation on trainee performance was measured in three ways. The first was learning performance including knowledge of combat tactics, topography, standard operating procedures and practical skills like navigation and accuracy of weapon firing. The second was attitudinal: how much the trainee desired to go on the next course, the extent to which the trainee would recommend the course to friends and their overall satisfaction with the course. The third dimension was the leadership perceptions trainees had of their instructors.

What impact did the setting of expectations have over the 15 weeks? The results showed a substantial effect on all three dimensions. The expectancy on trainees explained 73% of the variance in performance, 66% in attitudes and 28% in leadership. Trainees whose instructors were led to expect more did indeed learn more. Trainees of whom more was expected responded with more favourable attitudes toward the course. And trainees expected to do well by the instructors had a more positive impression of the instructors' behaviour. 'High command potential' trainees did better than the 'unclassified' group who in turn did better than the 'medium potential' trainees.

After the course the instructors were debriefed on the study. "The expectancy induction was so effective that it was difficult to convince the instructors that it had been random," noted the researchers and concluded that "managers get the performance they expect." (Reference: D Eden and A Shani, "Pygmalion Goes to Boot Camp: Expectancy, Leadership and Trainee Performance," in Journal of Applied Psychology, 1982, Vol 67, No 2, 194-199).
 
Implications For Managers

Implications of self-fulfilling expectations exist at both the manager level and the level of our talent planning systems in organisations.

For managers, if you have high expectations of your people:
1. Their performance will be higher (and vice versa if you have low expectations),
2. They will enjoy work more, and
3. They will think more highly of you (independent of your actual leadership ability!).

While in a practical sense you might relate better to, and have greater regard for, some of your team than for others, this inclination should be contained. If you show greater regard for some, it might lift the performance of those few but will likely diminish the performance of the rest. So for managers the tip is to have, and show, confidence in each and every team member. Your people will surprise themselves by doing things they never thought possible.

 

Implications For Organisations

At the organisational level, the fashionable A, B, C potential grid, literally putting people into boxes, is self-destructive. Given that few people are assessed as 'As', the system means that we are destined to have most of our people (the 'Bs') be less effective in their roles, be less engaged and to think less of their managers.

It leaves talent planning with a dilemma. On the one hand, organisations need to grow future generations of leaders. On the other hand, once the 'high potential leaders' are identified, there is a high chance those individuals will become the future leaders for no other reason than that they were expected to do so. The solution may be a) to give all young people the opportunity to be a high potential if they have the desire, b) to provide all self-nominated people similar development opportunities initially, and c) to have people progress based on actual demonstrated performance in different roles. Given that we have around 20 years to grow a graduate into a senior executive, we have time to err on the side of a wide catchment and hold off funnelling into a select few for some years.

So, the answer to the IBM question of whether or not we were gifted at spotting talent is most likely a case of the power of classification. Certain individuals, once classified as high potentials, were likely to fulfil that expectation. It also means there were probably many others, not so classified early in their careers, who did not progress as far for no other reason than that they were not expected to do so.


Thanks to Andrew O'Keeffe / Hardwired Humans 

How to Banish Bad Habits and Control Temptations

Psychological Research Suggests Bad Habits Can Be Controlled By Vigilant Monitoring.

Anyone who has ever found themselves trying to turn on the bathroom light seconds after phoning the power company to ask how long the power cut will last knows how easily habits bypass our conscious thought processes.

Part of the reason habits are so difficult to change is they are triggered unconsciously, often by situations we've encountered time and time again. Before going into the bathroom: turn on the light. After getting new email: waste 10 minutes aimlessly surfing the web.

Temptations, on the other hand, play more on visceral factors like hunger, sex or thirst. We see a muffin and can't resist.

New research published in the journal Personality and Social Psychology Bulletin by Quinn et al. (2010) suggests a different strategy for changing a bad habit than for resisting a temptation.

"Don't Do It!"

First, though, the researchers wanted to find out what habit-control strategies people use in everyday life. Ninety-nine students kept diaries of their battles with bad habits and temptations. Over 7 or 14 days they recorded each time they felt like giving in to a temptation or a bad habit they were trying to get rid of.

Top of the list for unwanted activities were excess sleeping, eating and procrastination (no big surprises there in a sample of students). The top strategies to combat these were:

  • Vigilant Monitoring: watching out for slip-ups and saying "Don't do it!" to yourself.
  • Distraction: trying to think about something else.
  • Stimulus Control: removing the opportunity to perform the habit, say by leaving the bar, fast-food restaurant or electronics store.

For strong habits, vigilant monitoring emerged from the self-reports as the most useful strategy, with distraction in second place. For strong temptations, by contrast, participants reported that stimulus control was the most effective strategy, while monitoring dropped to third place behind distraction.

For both weak habits and weak temptations the strategy used mattered less, although for weak temptations the monitoring strategy emerged as the best.

How To Defy A Bad Habit

As you'll have gathered from reading PsyBlog, though, psychologists are suspicious of what people say; they prefer experiments that show what people do. So, in a second study the researchers used a lab-based analogue of real life to see whether vigilant monitoring really is an effective strategy for controlling strong habits.

Sixty-five participants first learned one response to a word, then had to change this response in defiance of the habit they'd built up.

Backing up the first study, the experiment found that vigilant monitoring was the most successful short-term strategy for suppressing a strong habit. Once again for weak habits the type of strategy used made little difference.

Habits Versus Temptations

So, why does vigilant monitoring work for habits but not for temptations? Quinn et al. argue that it fails for temptations because watching out for slip-ups heightens our attention to the temptation, by which we are, ironically, once again tempted. Stimulus control, though, removes the opportunity: out of sight, out of mind.

Unlike temptations, habits are learnt by repetition and so they can sneak in under the radar. We find ourselves repeating them without thinking. Vigilant monitoring probably works because it helps us notice the habit and remember that we wanted to change it.

The Bad News

But, as anyone who has ever tried to change a long-held habit will know, continually monitoring for bad habits is tiring and some days your self-control is weaker than others.

This isn't helped by what are known as 'ironic processes of control' which I cover in my series '10 more brilliant social psychology studies'. This is the idea that monitoring a thought in the hope of getting rid of it only makes that thought come back stronger.

In the long-term it may be necessary to try and replace the old habit with a new one. Unfortunately this new habit is likely to be much more unstable than the old one.

I'd like to leave you with better news but sometimes it's good to know the worst. We are often slaves to our habits and many of these habits are extremely hard to change because they are triggered outside our conscious awareness. Anyone who tells you different is either lying to themselves or trying to sell you a quick-fix that probably won't work.

Thanks to PsyBlog

Saturday, July 3, 2010

Important Questions To Ask The Employer During An Interview

The interview session is the most important part of job hunting. You are grilled relentlessly by the interviewer, and you are so nervous about giving a wrong answer that you can hardly think of anything else. As a result, when the employer asks at the end of the session whether you have any questions, most of us are so glad the grilling is about to end that we respond in the negative. This is one of the gravest mistakes you can make, and it can cost you the job. If you do not ask any questions, the employer may conclude that you are not acquainted with the company or the requirements of the job, and that you came to the interview ill prepared. This is precisely why, whenever you show up for a job interview, you must research the company profile and the job requirements so that you are ready to ask intelligent questions at the end of the session.
 
Questions You Must Ask:

If you are one of those people who struggle to come up with sharp, intelligent questions to pose in an interview session, your worrying days are over. We have compiled a list of questions that every candidate should ask in a job interview; they can be generalized to any kind of job. Employers expect to hear such questions from candidates, and if you ask even some of them, there is a good chance you will leave a favorable impression on the employer.

 

What Will Be My Core Responsibilities For This Job?

This is a very important question; it shows the employer that you are genuinely interested in the job and want to know the details of its responsibilities. If the employer chooses you, having asked this question and received the answer will help you a lot in performing well. To perform well at your job, you must have a general idea of what the employer expects from you and what you must do to make sure they do not regret hiring you. Moreover, asking about the core responsibilities will give you a clearer idea of what the position entails, and if you are not sure whether you are well suited to the particular job, the answer can help you decide.

 

What Is The Company's Organizational Structure?

Learning about the prospective employer's company structure can give you an idea of the corporate culture at the organization and help you better understand where you would fit in the company if given the job. Every company has a different organizational structure, and asking this important question will clear up a lot of things for you.

 

How Much Learning Opportunity Does Your Company Have To Offer?

Asking such a question shows the employer that you are a serious candidate who is not only interested in the job at hand but also wishes to polish his skills by continuously learning new things. A candidate who is not interested in learning does not appeal much to employers. Also, as a professional it is important that you know what learning opportunities are available at the company so that you can take advantage of them once you are selected for the job.

 

How Often Does Your Company Carry Out A Performance Review And What Is The Basis For The Review?

Most companies have a process for performance review and will be able to tell you the exact requirements and qualities needed for a favorable performance evaluation. Others might not have an outlined procedure, and reviews might be loosely based on the manager's whim. In such cases performance appraisals get tricky, as a lot of office politics comes into play. So before you take the job, make sure you have gathered all the facts and know what you are getting into.

 

Can You Describe The Ideal Employee For The Given Job?

Asking what the employer perceives as an ideal employee will give you an idea of what will be expected of you once you are hired, which in turn will help you perform your job efficiently and effectively. Moreover, it will enable you to judge whether you will be able to keep up with the employer's expectations, or whether you need to work harder or apply for another job altogether.

 

Who Will I Be Working For?

You should get a general idea of who your manager will be and what kind of people will be on your team. If you are applying for a managerial position, you should also ask how many people will be working under you.

 

Does The Company Offer Educational Benefits?

Some companies do offer educational benefits to employees to polish their skills while they work for the organization. Such educational opportunities can do a lot to boost your career, so you should ask whether any such prospects exist and what the criteria are for taking advantage of them.

 

What Are The Prospects Of Moving To Different Departments Within The Company?

This question is important to determine whether there are chances of growth and rising to higher ranks within the company. Most companies encourage moving between departments within the organization and consider it a positive thing. This can also be good for the employee, as moving between departments ensures diverse experience while working for the same organization.

 

What Kind Of Training Will Be Available To Me?

Some companies offer regular training sessions, and employees learn a lot of valuable things during them. You should ask this question to find out what kind of training sessions are available at the company and what benefits will come of them.

 

What Are The Next Steps In The Interview Process?

You should ask this question so that you know whether there will be follow-up interviews, how many, and with whom. It will also give you a general idea of when to expect to hear from the employer if you are selected for the job in question.

 

If I Am Offered The Position, How Soon Will I Be Expected To Respond And Join?

You should know when you will be expected to join so that you can tell the employer whether you need to serve a notice period with your current employer, and suggest a date that would suit you to join the company.

 

Important Tips:

Make sure that you follow the following tips while framing the questions:

  • Avoid questions focusing on your leave, pay day and vacations, as the employer will inform you of the company policy in the offer letter, and raising these questions beforehand makes you appear like someone who is only in it for the benefits and cares little about the job.
  • Refrain from asking questions that are answered on the company's website or are too obvious, as they reveal a lack of maturity on your end. Research the company well before going to the interview.
  • Some questions might be answered during the course of the interview. If, by the end, all your questions have been answered, rather than repeating them just tell the employer what you wanted to ask and that your queries have been addressed during the session.

Conclusion:

You have to keep in mind that an interview session does not have to be one-sided. You too can ask questions, and doing so will create a better impression than saying nothing. However, asking too many questions, or unintelligent ones, can also ruin your interview. So be well prepared and ask the right questions to make a lasting impression on the employer during the first interview.

Thanks to Rozee Pk

Study: Qigong and Tai Chi Demonstrate Health Benefits

A review study published in the current issue of the American Journal of Health Promotion states that Qigong and Tai Chi, exercises which originated in China centuries ago, have significant benefits for physical and mental health. Qigong is a term used to describe exercises which promote the flow of qi (pronounced "chee"), or energy, in the body. Tai Chi refers to a very specific series of movements which require discipline, flexibility and muscle control to master. The authors of the study found that practice of these exercises improved cardiovascular health, immune function, mental health and overall quality of life. Practitioners also had improved balance and a reduced risk of falls. While yoga is still a more popular form of low-impact exercise, more and more Americans are embracing Qigong and Tai Chi as alternatives. Coincidentally, I attended my first Tai Chi class last week. Believe me, it's harder than it looks and I definitely broke a sweat by the end of the hour.
 
Thanks to Daily Dose