We recently sat down with Associate Professor Cathy Smith, Ph.D., to talk about the School of Information’s recent revision of the User Experience (UX) Master’s Degree program and the future of this field.
CCI: Well, to start with, what does user experience mean, and what does it encompass?
Cathy Smith: User experience grew from a recognition that successful designs are centered on deep knowledge of the ultimate user of a thing. Who is using the thing, and what happens to them before, during and after they use it? The thing can be tangible like a dishwasher, intangible like a contract or abstract like a process. And the need for the user’s view has only become more pressing as digital interactions pervade modern life. The goal of user experience is to improve users’ lives by making things easier, safer and more rewarding.
CCI: What attracted you to the field?
CS: Back when search engines like Google were just starting, it was an exciting time to learn about information system design and how to make it better. I had designed a “database-driven website,” the kind of site that presents different information depending on what the user needs. Its search was for date, media type, genre—very straightforward. Searching for subject matter or point-of-view was much more difficult, and I wanted to create a system that would help users with ambiguity and sentiment.
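To make that distinction concrete, here is a minimal sketch (in Python, with invented records, field names and queries) of the kind of faceted, database-driven search she describes: filtering on structured fields like date or genre is straightforward, while subject matter and point-of-view live in free text that exact-match filters cannot reach.

```python
# Minimal sketch of the faceted, "database-driven" search described
# above. All records, field names and queries are hypothetical.
records = [
    {"title": "Harvest", "year": 1998, "media": "film",
     "genre": "documentary",
     "notes": "a sympathetic portrait of small family farms"},
    {"title": "Yield", "year": 2001, "media": "film",
     "genre": "documentary",
     "notes": "a skeptical look at farm subsidies"},
    {"title": "Acres", "year": 1998, "media": "audio",
     "genre": "interview",
     "notes": "farmers describe a difficult season"},
]

def facet_search(items, **filters):
    """Straightforward: exact matches on structured fields."""
    return [r for r in items
            if all(r.get(field) == value for field, value in filters.items())]

# Easy: structured facets like date and media type.
print(facet_search(records, year=1998, media="film"))  # -> the "Harvest" record

# Hard: "find the skeptical one." Point-of-view lives in free text,
# so an exact-match filter cannot express the question at all.
```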
CCI: And this brought you to Kent State eventually?
CS: Kent State had one of the very first interdisciplinary programs that combined user experience with information science. Also, the College of Communication and Information had an interdisciplinary Ph.D. program, which was the type of program I attended at Rutgers University.
CCI: From your research, studies and teaching, what do you believe is the role of the psychology of information seeking in a user’s online experience?
CS: That’s a great question! One key design challenge has been a system’s ability to understand a user’s context and to close the gap between their mental lexicon (the words they use) and the system’s limited communication abilities. Recent advances in artificial intelligence are a direct outgrowth of these design ideas, with the same goal of improving information search.
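As a concrete illustration of that lexicon gap, here is a minimal sketch, with hypothetical documents and a toy synonym table standing in for the associations modern AI models learn at scale: exact keyword matching fails when the user’s word differs from the system’s, and even a small vocabulary bridge closes the gap.

```python
# Sketch of the "mental lexicon" gap: the user says "fix a tap," the
# system's documents say "repair a faucet." All documents, queries and
# the tiny synonym table are hypothetical, for illustration only.
documents = {
    "doc1": "how to repair a leaking faucet",
    "doc2": "choosing a new dishwasher",
}

def exact_search(query, docs):
    """A document matches only if every query word appears verbatim."""
    return [d for d, text in docs.items()
            if all(w in text.split() for w in query.split())]

# Hand-built stand-in for what an AI model learns from data at scale.
synonyms = {"fix": ["repair", "mend"], "tap": ["faucet", "spigot"]}

def expanded_search(query, docs):
    """A document matches if every query word, or a synonym of it, appears."""
    return [d for d, text in docs.items()
            if all(any(t in text.split() for t in [w, *synonyms.get(w, [])])
                   for w in query.split())]

print(exact_search("fix a tap", documents))     # [] -- vocabulary mismatch
print(expanded_search("fix a tap", documents))  # ['doc1'] -- gap closed
```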
CCI: You were recently tasked with restructuring the iSchool’s UX program. What changes did you make, and why?
CS: We talked with students, alumni, instructors and people from industry as part of an intensive review of the program. Two things we heard repeatedly were the need for more flexibility and the chance to go more deeply into topics. We accomplished both by reducing the number of credit hours required for the degree and the number of required courses. It’s now only 30 credit hours instead of 36, and half of those credits can be earned in a subject area of the student’s choosing. There are more elective choices as well.
There is also a third important change: an innovative approach to the final required course. The new program integrates professional development goals into every class, so students are thinking about their goals and career preparation from day one. The need to “begin with the end in mind” came through clearly in student comments.
CCI: So where is user experience headed in the future?
CS: This is a growing field. Companies, governments and service organizations recognize the value of understanding what it is like to use the things they design. Future changes to the profession will include more effective tools for data collection and analysis. Why do users like this? What about it do they like?
The goal is to improve users’ lives by making things easier, safer and more rewarding. But that requires trust. You want people to have a healthy sense of questioning what they’re being given by the machine. And that’s part of information science. What does it mean to authenticate something? It’s really challenging. It’s deep stuff. There’s a whole area of research around making much more reliable tools, tools that can be questioned, tools that can explain themselves. How can you set up a learning system that builds trust but prevents indoctrination? It’s easy to build a propaganda machine. What’s harder to do is build an anti-brainwashing machine.