Article on I/O Psychology in the St. Louis Post-Dispatch

The St. Louis Post-Dispatch recently ran a story on Industrial/Organizational Psychologists, or “Work Psychologists” as they described us. They kind of spotlight a guy by the name of Joel Philo at Frito-Lay (I’ve actually met Joel and found him pretty sharp myself), and overall the piece is a pretty good representation of what I/O Psychologists do. Here’s a quote:

“As talent becomes scarcer than capital, identifying, developing and retaining that talent becomes a top priority,” [Philo] said.

“People in my field are specialists in such talent management and can use our knowledge of statistics, proper research design and psychology — especially around motivation, individual differences, leadership and social psychology — to contribute verifiable value to the bottom line.”

Workplace psychologists facilitate many tasks, including promotion, computer-based learning and team design. They also guide organizations in mergers and acquisitions, human resource management and statistical analysis.

Yep, pretty much. What cracked me up about this piece, though, was how they couldn’t resist punching it up a bit towards the end with a kind of crime drama zest:

“We know how to go into an organization, diagnose a problem and study it scientifically so that the ultimate solution is based on data and not short-term, superficial analysis. I-O psychologists are deep thinkers,” said Wendy S. Becker…

One of her projects involves crime labs. While technology such as DNA testing and fingerprint databases can help solve more crimes quickly, these labs are hampered by their inability to attract, develop and retain forensic scientists, said Becker, an assistant professor of management at the State University of New York at Albany.

“There is a national case backlog — crimes not being solved — because of the need for trained employees,” she said.

To assist labs in getting over these humps, Becker documents employee issues, creates surveys and designs performance measures. She also aids labs with accreditation.

I would TOTALLY watch a CSI-esque show about I/O Psychologists who diagnose and solve organizational problems. How fun would it be to see that crazy camera work zooming in on a nonsignificant correlation coefficient, accompanied by a dramatic “DONG-DONK!” sound? Or have a sepia-toned flashback to an ineffective meeting by a dysfunctional team? Or have a hard-boiled I/O Psychologist (maybe played by Rick Schroeder) confronting a middle manager over his failure to close feedback loops or evaluate training effectiveness? I would totally have my TiVo record that and consider watching it when there was nothing better on. Totally.

Comments System Down

I’ve taken the comments system down due to too much comment spam. It got to the point where I was getting something like 100 spam comments a day. I’ll turn it back on once I upgrade to the newest version of Movable Type, which should be sometime this week.

Ten Ways to Improve SIOP

I’ve been to 11 SIOP conferences now, and I’ve seen a lot of improvements in the program and the way things are run. That said, however, I think there’s still a lot of room for improvement. Here are just a few ideas:
One: Podcasting of seminars. Sure, you can supposedly buy a CD with .mp3 recordings of all the seminars from a company called Blue Sky, and SIOP is supposed to be streaming the content from their website (though it’s still not up at the time of this writing). But really. That’s so 2004. The kids today, they want their discussions of leadership development and meta-analysis right away, and in a form they can just dump into their fancy iPods as they head out the door to their rave parties and ice cream socials. SIOP should be posting these things right away and making podcasts immediately available to all paid members.
Two: More mini workshops. There are some of these things that go on, but the entire nature of the conference seems to shy away from basic content like “Job Analysis 101” or “Implementing an online 360 survey 101” or the like. There’s plenty of heady academic research and reports on what some mega consulting agency or another was paid to do, but there need to be more one-off, simple, and eminently usable workshops on how to do simple stuff, beyond those expensive pre-conference workshops. This would be a big draw for grad students and newly minted I/O professionals.
Three: More quality control, especially on symposiums. Okay, I think I can say this since I’ve been a SIOP reviewer and had SIOP submissions both rejected and accepted in the past. But really, there needs to be more QC here. A lot of the posters are kind of “meh” and obviously someone’s Master’s Thesis conducted on college sophomores in desperate need of extra credit, but they’re easy to ignore and I think the recent practice of displaying the best posters during evening receptions is a great idea. But symposiums aren’t really reviewed very much. Just proposals for symposiums, not the actual content, which too often doesn’t match up with what’s described in the program. So, SIOP, start demanding at least completed papers, and preferably completed presentations before these things are accepted.
Four: Make an online archive of presentations. Preferably all SIOP presentations, but at least put up a section on the SIOP website where presenters who so wish can have their PowerPoint slides, posters, and other papers available for easy searching and downloading. I’m tired of hunting through the program for presenters’ e-mails so I can beg for a copy of their slides or paper. Put them up there next to those podcasts I was talking about. Now I know that there’s some concern about whether this constitutes a “publication” and some researchers wouldn’t want to jeopardize their chances of getting some form of their SIOP presentation published in a refereed journal. But for many (most?) presenters, this isn’t an issue. Let them choose whether to have it included.
Five: Move to a conference center, for crying out loud. I’ve never been to a SIOP where the hotel was ideally laid out to accommodate a conference like this, and I doubt I ever will. Conference centers exist specifically to accommodate this kind of thing — they’re big, well equipped, laid out in a manner that makes getting around easy, and they’re usually located in the heart of an area with plenty of accommodations. So move SIOP to one. I know this is kind of a pipe dream, though, from conversations with someone who has negotiated contracts for the conference. Hotels subsidize some of the conference costs in exchange for guaranteeing a certain number of guests for the hotel rooms. Without this discount, SIOP would be more expensive. But hey, I can still complain, right?
Six: Move away from the weekend. I know that SIOP has traditionally been over the weekend (plus Friday and pre-conference workshops on Thursday) in order to allow academics and students to attend while cancelling fewer classes, but come on. Giving up my weekend is tough. I would much rather give up my work week! Run the conference during the week, or at least from Thursday to Saturday. And actually, I hear that this is going to happen starting in 2008! But I’m still going to complain until then, just on principle.
Okay, so those are six reasonable recommendations for improving SIOP. Now, for fun, here are some that are less reasonable, but which I would fully support.
Seven: Ditch the Plenary Session. Seriously, who goes to this? I mean, I know that hundreds of people go to it every year, but my point is that I never go to it. It’s a bunch of people I don’t know being given awards I didn’t know existed. None of it impacts me much. Instead, use the time to give everyone a free pony.
Eight: Whittle down the exhibitor hall. There are too many exhibitors in the exhibitor hall. I don’t think that’s what it’s for. Each day of the conference, attendees should be able to vote exhibitors out of the hall after having them complete difficult tasks like eating bug sandwiches or running a Denny’s for a day. Have Donald Trump do it. At the end of the conference, the last vendor standing should be awarded ALL contracts for ANY work that ANYONE needs done. Then it should be revealed that they’re a construction worker from Boise, Idaho.
Nine: Trapdoors, trapdoors, trapdoors! Those little “2 minutes” signs that symposium chairs hold up to curtail the ramblings of long-winded presenters work some of the time, but we need trap doors for the more egregious offenders. There should be a big golden rope with a tassel on the end that hangs down in front of the facilitator or discussant. If someone goes over their time limit, the facilitator gives it a yank. Problem solved and we can all get to the next coffee break before the grad students scarf up all the cookies.
Ten: Zero Tolerance for Correlation Matrices. If any presenters utter the words “Now, I know you can’t read this…” and include in their presentation a giant table full of tiny tiny numbers, they should be thrown into gladiatorial combat with one of the vendors from the exhibitor hall. To the death. And accompanied by the music from that one episode of Star Trek where Kirk fought against Spock.
So there you go, SIOP. Feel free to take any of these ideas and run with them.

Patent on Online Testing to be Re-examined

Here’s kind of an interesting story. A while back a company known as Test Central Inc. filed a patent on online testing. The patent was so worded that it gave them the ability to shake down educational testers who offered tests over the internet, and it could have been used against the users of online employment tests. In other words, “the extremely broad patent claims to cover almost all methods of online testing.” I wrote about the patent last year on my other blog.
Apparently an organization known as the Electronic Frontier Foundation (EFF) lobbied to have the patent re-examined due to its, you know, absurdity. Says the article:

EFF filed the reexamination request because the extremely broad patent claims to cover almost all methods of online testing. [Test Central] has used this patent to demand payments from universities with distance education programs that give tests online. But EFF, in conjunction with Theodore C. McCullough of the Lemaire Patent Law Firm, showed that [Test Central] was not the first to come up with this testing method — IntraLearn Software Corporation had been marketing an online test-taking system long before [it] filed its patent request.

Pretty cool. Nothing has been revoked yet, though the U.S. Patent and Trademark Office (PTO) has supposedly revoked about 70% of the patents it takes the time to re-examine.

Front-Loading Testing into the Applicant Tracking Process

Okay, last post about a specific program I saw at this year’s SIOP conference. This one was kind of interesting, though, because it dealt with a process that seems to me like it should be a lot more common than it is. In “Cutting Edge Selection: Turning Applicant Tracking into Talent Acquisition,” several companies talked about how they had front loaded one kind of testing or another into the application process. In other words, they immediately tested people through the Internet as part of the online application process. No screening, no scheduling, just WHAM! You’ve been tested.
I’ve often wondered why more companies don’t do this. For tests that are valid across a wide variety of jobs (e.g., cognitive ability tests and, to a more debatable degree, some personality scales), you could just have everybody take them as part of the application. (See last week’s post on unproctored testing for thoughts on how to deal with cheating.) You don’t even have to score them, but you’ll have them on file and ready to go when you want. The same could be said for job-specific tests if you have a job with lots of applicant flow. The presenters at the SIOP symposium talked about how they used minimum job qualifications and situational judgment tests, but it could really be done with almost any test amenable to the web.
I guess there would be some reasons to be careful about this — if you use a vendor’s test and have to pay per administration, then the fees could rack up pretty quickly, for example. And you’d need to think through what this does to your EEO statistics, especially since putting a test earlier in the application process means it gets taken by people who would otherwise have been screened out, which can increase adverse impact. And these presenters were folks who were drowning in resumes — one of them reported getting fifty thousand resumes per month. But if your applicant pool is big enough and you own the tests so you’re just pushing bits around, then why not include a test of reasonable length right up front?
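Since I brought up adverse impact, here’s a quick sketch of the kind of four-fifths rule check your EEO folks would run on those pass rates once everyone is being tested up front. To be clear, this is my own toy illustration, not anything the presenters showed — the group labels and pass counts are made up, and a real analysis would be considerably more involved.

    # Toy four-fifths rule check -- all numbers are made up for illustration.

    def selection_ratio(passed, applied):
        """Proportion of a group's applicants who pass the screen."""
        return passed / applied

    def four_fifths_check(group_a, group_b):
        """Compare two groups' selection ratios against the 4/5ths guideline.

        Each argument is a (passed, applied) tuple. Returns the impact ratio
        (lower rate divided by higher rate) and whether it meets 0.80.
        """
        rate_a = selection_ratio(*group_a)
        rate_b = selection_ratio(*group_b)
        impact_ratio = min(rate_a, rate_b) / max(rate_a, rate_b)
        return impact_ratio, impact_ratio >= 0.80

    # With testing moved to the front, these ratios now describe everyone
    # who clicks "apply," not just the resumes a recruiter already liked.
    impact, ok = four_fifths_check(group_a=(600, 1000), group_b=(420, 1000))
    print(f"Impact ratio: {impact:.2f}; meets 4/5ths guideline: {ok}")

The point isn’t these particular numbers; it’s that the test’s pass rates now get computed over a much bigger (and differently composed) pool than they would if the test came after a resume screen.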

The Utility of Unproctored, Internet-Based Testing

I like the Internet and I think it’s going to be HUGE some day. So another one of the better SIOP symposiums that I saw this year was “Unproctored Internet Testing: What do the Data Say,” which dealt with unproctored, Internet-based, pre-employment testing. It’s been a hot topic for years now, but what was noteworthy about this talk was that it dealt with actual data from a number of different research projects. It was pretty interesting to see what actual data had to say on the topic, though some of the presentations elicited a few “well, duh” moments.
The short version of most of the presentations was that mean test scores don’t go up over time when you offer unproctored testing. I’m not really sure why this would be surprising if you’re offering non-cognitive tests like personality, biodata, or situational judgment tests where there’s no objectively correct answer that you can get from your calculator, a web search, or your nerdy roommate. And most unproctored testing programs these days seem to omit a cognitively loaded test out of (probably legitimate) fear of cheating.
My favorite part of this symposium, then, was when a couple of guys from Sprint and Previsor/ePredix used utility analysis to decloak the massive elephant in the room: when you drop cognitive ability tests so that you can go unproctored, the overall validity of your selection system suffers, as does the return on investment. The presenters made some modest estimates of the drop in validity (say 15% or 20%) and then calculated the utility for a proctored version of a test battery that included a cognitive ability test, and for an unproctored, Internet-based version without the cognitive test. Guess which came out ahead? Oh, I’ll tell you: the proctored one that had the cognitive ability test.
Now I know that utility analysis has its own problems. Mainly the wonky nature of SDy, the standard deviation of a worker’s productivity in dollars (I think that’s what it is, it’s been a while so somebody correct me if I’m wrong here). Even minor overestimates can result in utility values that rival the gross domestic product of, say, the Western hemisphere.
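For anyone who hasn’t seen it in a while, the usual Brogden-Cronbach-Gleser formula is roughly: utility = (number hired) x (tenure in years) x (validity) x (SDy) x (average standardized score of those hired), minus the cost of testing. A quick sketch shows both why the proctored battery wins and why SDy is the lever that sends the numbers into orbit. The validities, SDy, and costs below are placeholders I made up (the only thing borrowed from the presenters is the rough 20% validity hit), so treat this as the shape of the argument, not their results.

    # Brogden-Cronbach-Gleser utility with made-up inputs, for illustration only.

    def utility(n_hired, tenure_yrs, validity, sd_y, mean_z_hired, total_cost):
        """Estimated dollar gain from using the selection system."""
        return n_hired * tenure_yrs * validity * sd_y * mean_z_hired - total_cost

    common = dict(n_hired=100, tenure_yrs=2, sd_y=20_000, mean_z_hired=1.0)

    # Proctored battery keeps the cognitive test; unproctored drops it (~20% validity hit).
    proctored = utility(validity=0.50, total_cost=150_000, **common)
    unproctored = utility(validity=0.40, total_cost=50_000, **common)

    print(f"Proctored battery:   ${proctored:,.0f}")
    print(f"Unproctored battery: ${unproctored:,.0f}")

    # The SDy wonkiness: double it and every estimate balloons right along with it.
    inflated = utility(validity=0.50, total_cost=150_000, **{**common, "sd_y": 40_000})
    print(f"Proctored, SDy doubled: ${inflated:,.0f}")

Even with the unproctored version costing a third as much to run in this made-up example, the validity term dominates, which is exactly the elephant the presenters were pointing at.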
But given all else we know about cognitive ability testing, I believe the trend if not the specific numbers. I’m not sure I would ever recommend yanking cognitively loaded tests that are valid and job related just so you can sell the idea of unproctored web-based testing more easily. A better approach that I’ve seen some companies adopt is to give the tests without proctoring, screening out the people who are neither capable enough to pass on their own nor unscrupulous enough to cheat. Sure, some cheaters who would otherwise fail will slip through, but then you bring in the people who passed and test them again with an alternate form of the test in a proctored environment. The people who can’t pass without cheating get screened out here. You’re testing fewer people overall on-site (and thus saving money), but you still get rid of the cheaters. I’d have liked to have seen a utility analysis of this approach, but unfortunately the presenters didn’t address it.
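And since I’m wishing for a utility analysis of that two-stage approach, here’s the back-of-the-envelope version I have in mind. Every number in it — applicant counts, pass rates, the share of would-be failers who cheat, per-administration costs — is invented purely to show where the savings come from; none of it is from the symposium.

    # Back-of-the-envelope look at unproctored-then-verify testing.
    # All rates and costs are invented for illustration.

    applicants = 10_000
    honest_pass_rate = 0.30   # would pass the test on their own
    cheat_rate = 0.10         # share of would-be failers who cheat and "pass" stage 1
    onsite_cost = 25.0        # cost per proctored administration

    # Stage 1 (unproctored): honest passers plus successful cheaters get through.
    honest_passers = applicants * honest_pass_rate
    cheaters = applicants * (1 - honest_pass_rate) * cheat_rate
    stage1_passers = honest_passers + cheaters

    # Stage 2 (proctored alternate form) screens out the cheaters;
    # compare the on-site testing bill under each approach.
    onsite_cost_everyone = applicants * onsite_cost
    onsite_cost_two_stage = stage1_passers * onsite_cost

    print(f"Stage 1 passers: {stage1_passers:,.0f} ({cheaters:,.0f} of them cheaters)")
    print(f"Proctor everyone on-site:  ${onsite_cost_everyone:,.0f}")
    print(f"Proctor stage-1 passers:   ${onsite_cost_two_stage:,.0f}")
    print(f"Verified pool after stage 2: {honest_passers:,.0f}")

A real utility analysis would fold in validity and SDy like the sketch above, but even this crude version shows the appeal: you only pay for proctoring the people who cleared stage one, and the cheaters still get caught before anyone hires them.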

Using Video Games to Enhance Training Effectiveness

I admit, I have a fascination for the space where video games and I/O Psychology intersect (see here and here). That’s why there was one symposium at SIOP this year that I absolutely had to attend: Learn N’Play: Effectiveness of Videogame-Based Simulations for Training and Development.
It was composed of two purely theoretical pieces and two empirical ones. The theoretical ones basically looked at where the nature of video games and training overlapped and identified a TON of questions that haven’t been answered by any quality research. Most of it centered on how games could be used to facilitate learning and transfer of training to the job. If there are any gamer graduate students out there, this area is ripe for the picking, and you would totally have an excuse for playing Counter-Strike when you should be working on your dissertation.
The empirical pieces focused on qualities of the video game and qualities of the player that might enhance or inhibit the learning that was supposed to take place. The findings weren’t exactly numerous or shocking, but it’s nice to see someone study this stuff scientifically. One researcher found, for example, that if in-game information is presented in spoken form or incorporated into mission/game objectives, it’s much more likely to be recalled than if it’s presented as just written text. This research line is still obviously in its infancy, but I think it’s fascinating to see where entertainment and science come together.
The group participation time that followed the presentations was also pretty interesting. People, myself included, were interested in the question of whether the medium of video games could be used for selection and assessment purposes, and some of the panelists seemed to think so. Discussion even turned to using massively multiplayer games like World of Warcraft to facilitate emergent leadership research and leaderless group discussions. While I don’t think that’s very likely (too messy; selection measurements need to be more precise), it’s fun to think about. I have a post about a Nintendo DS game called “Brain Age” that I’ve been meaning to make for a while, so I’ll follow up on this idea there.

SIOP 2006: Even the Effect Sizes are Bigger in Texas


Back! The 2006 convention of the Society for Industrial and Organizational Psychology (SIOP to you) was last weekend, and I’m glad I went. I’ve been to 12 of the last 13 SIOP conventions and I always look forward to them for some reason. Partly because I get to hear about all the research that’s going on without having to stick my nose into one of the refereed journals, and partly because I get to see so many old friends and colleagues. Between people I used to work with, people I went to graduate school with, and random folks I’ve met over the years, I can hardly walk 10 feet without bumping into someone to chat with. I don’t know how extroverted people do it year round.
The symposiums were actually more miss than hit this year, though that was probably due to bad choices on my part. I just seemed to have a knack for picking things that either weren’t what they were advertised to be or turned out to be a lot more boring than I expected. I went into one, for example, that was supposed to be about getting the most out of collaborations between academics and practitioners, only to find out that it was just some graduate students talking about some killer internships they like totally worked at. Honestly.
At any rate, I’ve already written my column for the July 2006 issue of the SIOP magazine TIP, so I guess it’s time to call it quits on my hiatus and write some stuff here. I’ll probably throw in a few updates outside of the Tuesday/Thursday schedule to make up for lost time. You know, so you get your money’s worth.
The new job, by the way, is great! It’s just kind of overwhelming to meet dozens of new people, learn scores of acronyms and company-specific buzz words, and try to figure out which bathroom is closest to my office. Also, the movers in San Diego grabbed most of my clothing and packed it up in temporary storage, so I’m in desperate need of pants. I mean additional pants. I have some.