Researchers Study Software Gender Gap
Sunday, September 23, 2007; 12:17 PM
SEATTLE -- For more than a decade, academics and technology executives have been fretting over the widening gender gap in computer science. Everyone has a theory, but no one has managed to attract many more women to the field.
Now, some computer science researchers say one solution may lie in the design of software itself, even the programs regular people use every day.
Laura Beckwith, a new computer science Ph.D. from Oregon State University, and her adviser, Margaret Burnett, specialize in studying the way people use computers to solve everyday problems: adding formulas to spreadsheets, animation to Web sites and styles to word processing documents.
A couple of years ago, they stumbled upon an intriguing tidbit: Men, it seemed, were more likely than women to use advanced software features, specifically ones that help users find and fix errors. Programmers call this "debugging," and it's a crucial step in building programs that work.
Beckwith decided to investigate why women and men might interact so differently with the same software. She pored over 30 years' worth of books and academic papers from psychologists, education researchers, economists, computer scientists and others about gender differences in problem solving and computer use.
One theory grabbed her attention: High confidence correlates with success. Both men's and women's confidence in their ability to do a challenging task affects their approach and the outcome. And most studies indicated that women, even ones who study computer science, have less confidence than men in their computer skills.
So Beckwith wondered: Could that be one of the culprits? Are women less confident than men when it comes to software debugging? Are women less willing than men to try using these advanced features?
Beckwith tackled these and other questions in her dissertation, with guidance from Burnett and Susan Wiedenbeck of Drexel University.
She started by asking a group of women and men, in a questionnaire, whether they believed they could find and fix errors in spreadsheets filled with formulas.
Then, she sat them down in front of a computer with two spreadsheets. One tracked students' grades, and another calculated employees' paychecks.
Beckwith buried five errors in each one without telling the participants. She gave them a time limit and asked them to test all the formulas and fix any bugs.
The program included a debugging feature that helped users spot miscalculations in the formulas underlying the spreadsheet, along with other errors. When they clicked on a number that seemed wrong (a grade point average that looked too low given the student's test scores, for example), the cells in the spreadsheet grid that might contain the source of the error changed color. If the participants were sure a formula or value was correct, they could check it off.
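The mechanics of that highlighting can be sketched in a few lines of code. This is only an illustrative toy, not the actual software used in the study: the `suspects` function and the cell names are assumptions made up for this example. The idea is that when a user flags a displayed value as wrong, every unverified cell whose formula feeds into it gets flagged as a possible source of the error.

```python
# Hypothetical sketch of the debugging feature described above:
# flag a wrong cell, then walk its formula dependencies, skipping
# cells the user has already checked off as correct.

def suspects(deps, marked_wrong, verified):
    """Return the cells to highlight as possible error sources.

    deps: dict mapping each cell to the cells its formula reads from.
    marked_wrong: the cell whose displayed value the user flagged.
    verified: cells the user has already checked off as correct.
    """
    flagged, stack = set(), [marked_wrong]
    while stack:
        cell = stack.pop()
        if cell in flagged or cell in verified:
            continue  # already highlighted, or vouched for by the user
        flagged.add(cell)
        stack.extend(deps.get(cell, []))  # follow the formula's inputs
    return flagged

# Toy gradebook: the GPA depends on two averages, each on raw scores.
deps = {
    "GPA": ["TestAvg", "HomeworkAvg"],
    "TestAvg": ["Test1", "Test2"],
    "HomeworkAvg": ["HW1", "HW2"],
}

# The user flags the GPA as too low but has checked off TestAvg,
# so only the homework side of the calculation stays highlighted.
print(sorted(suspects(deps, "GPA", {"TestAvg"})))
```

In this sketch, checking a cell off does double duty: it clears that cell and stops the search from spreading into the values behind it, which is one plausible way such a feature could narrow the hunt for a bug.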