e-Talk Radio: Pettichord, Bret, 8 February 2001

Summary:

Ms. Dekkers and Mr. Pettichord talk about why testers and developers think differently, and why they should think differently.

TEXT TRANSCRIPT: 8 February 2001
Copyright 2001 Quality Plus Technologies and Carol Dekkers. All rights reserved.

Announcer: Welcome to Quality Plus e-Talk! with Carol Dekkers, brought to you by StickyMinds.com, the online resource for building better software. This program will focus on the latest in the field of technology. All comments, views, and opinions are those of the host, guests, and callers. Now let's join your host, Carol Dekkers.

Carol: Welcome to show number six of our second series of Quality Plus E-Talk! with Carol Dekkers. I am Carol Dekkers, your host. I would like to introduce you to our guest this week. Fortunately, we do not have a virtual guest. We have a live, in-person expert, tester, developer, consultant...Bret Pettichord.

Bret: Hello.

Carol: I'm sorry, I thought somebody was saying something else. We have had technical glitches and different things that have happened over the past few weeks. So, when somebody says "hi" I am not sure who is saying "hi." I am very happy to have Bret here on the show with us. He is joining us live from Austin, Texas, where you said it was drizzling today, I guess? Bret, is that correct?

Bret: Yes. It is a little overcast.

Carol: A little overcast. Well, here in Florida...this is my first time this entire winter that I can rub this in...it is probably the first day this winter that the temperatures in Florida are actually higher than they are up north. I think we are going to hit 70 Fahrenheit today. This is the only time I will rub it in all winter, but it is wonderful here. Before I go into a little bit about Bret's history and what he is going to talk about, I would like to tell you a little bit about me and what I do. We help companies build better software, and that is really the theme of the show, and why we are sponsoring with StickyMinds.com, which is the online resource for building better software. My company is called Quality Plus Technologies. We help people build better software through managing by facts. In other words, using software measurements to track and control your software processes and find out what actually works and what might not work so well. You can really track, compare, and then get to the root cause of why things are working or not working.

We developed a calendar that shows the CMMI (Capability Maturity Model Integration) together with function point analysis and where it can contribute. If anyone would like a copy of the calendar, I would like to extend the invitation that we have had for the past couple weeks: if you send me an email at [email protected], I would be happy to send one to you.

Without further ado, the reason why we are really here today is to get Bret's expertise and input on why developers and testers think differently, and why maybe they should think differently. Bret is the editor of the Software Testing Hotlist. He consults on software testing and test automation. He has been a test automation lead for IBM, Tivoli, and BMC Software and a staff consultant for Segue Software, a test automation company. He has found that test automation requires the skills of both a tester and a developer. He has focused on test automation for the past eight years; before that he was a tester, and before that, he had a life as a developer. So, you have really run the gamut. You are really an expert to be talking about this, Bret. You have been a developer, you have been a tester, you have been a test manager. You have really morphed through several careers.

Bret: Well, I suppose so.

Carol: One of the most interesting articles that you have written was featured in STQE magazine back in January or February of 2000, Volume II, Issue I. You titled it, "Testers and Developers Think Differently." For anyone who has not read it, I think it is a fascinating read. I love the way that you set it up. I think that you have a very good talent for being able to really reach the audience, Bret. So, I would like to congratulate you on the article.

Bret: Well, thank you.

Carol: What can you say in terms of your own background? What are the different things that you had to go through as a tester, as a developer? You started out in development and moved into testing--some people do it the opposite way.

Bret: Yes. I got into development, and then when I got into testing it was kind of a temporary thing, because I figured development was the place to be. It took me a while, going back and forth between different jobs, to realize that I had a talent for testing and that some things about development did not attract me as much. In fact, what I found is that test automation allows me to do development work but still focus on the testing perspective of things. That is probably the reason I like doing that work.

Carol: That is interesting. One of the things that you mention in the "Testers and Developers Think Differently" article is exactly the thing that you just mentioned: that developers look at themselves as being in the best possible world, and that testers are almost like an ugly stepchild, or that they just don't quite know as much. You went into a fairly good dissertation in the article. One of the words you introduced, which I think was really interesting, is "dilettante." I would like you to explain what you mean by a dilettante, and how that applies to testers versus developers.

Bret: Well, one of the things that many testers face when they are in a development group is that the developers are kind of the ones who know everything. They know how the software works; they know the technical details of a lot of things. That is part of what developers are expected to do. They are expected to have a high degree of technical expertise. A lot of testers will feel that they are not measuring up if they do not have the same level of expertise. One of the things that I have found is that it is actually very helpful for testers to be able to operate without having that expertise, and part of this just comes from the fact that our products are going to be used by people who are not technology experts. In most cases, at least, that is true. So, we need to be able to come in and look at the software products that we are testing, that the company is going to be developing, with that kind of fresh eye that says, "Okay, I know a little bit about the technology and a little bit about this, but it really should make sense to me, and it should be clear how to use it." If you can take that kind of approach, it is often very beneficial as a tester.

I am often really happy when I get new testers on a team because they have that kind of fresh eye. I try to remember that fresh perspective as I work with a project for a long period of time, because most of your users are going to have that fresh perspective. Most of our users come in, learn the product, and use it right away. We are seeing fewer and fewer cases now where the software we write is used by expert users who learn it inside and out and are going to take training on it and whatnot. So, that is part of what I meant by the dilettante: a dilettante is someone who can go in and learn a little bit about a subject. We have to do that with the technical stuff too. In order to communicate with the developers, we need to know their language. We need to know enough about the technology to communicate and get the information that is going to help us decide which areas to focus our testing on.

Carol: That is a good explanation: testers need more of a generalist attitude, as opposed to specific COBOL or Java expertise or the ability to program. I think the perspective that you bring is very interesting. I will ask you a question. The attitude that most developers hold towards testers, is it kind of the same attitude that a lot of developers hold towards users?

Bret: Well, I think that is a struggle that we face. Actually, just yesterday, I was calling in to a technical support line with a problem that I had, and the person I was talking to said, "Well, that should not be happening." The implication was not that the system was malfunctioning but that somehow I had done something wrong. Of course, it should not have been happening that way, but eventually I got through to somebody who was able to fix the problem for me. That is an attitude that really pervades the whole technology industry; it is not just the developers. It is the people who work with it, and for many of us, part of our introduction to the technology is learning to somehow take the blame for all the problems that we have. It is our job to get everything working. I assume that people who use computers see several errors every day and get used to seeing that as normal. I think a good tester, and a good developer as well, tries to confront those attitudes, which are common in the industry and which a lot of us struggle with.

Carol: Right. We talked to a guest in a previous series about the intimidation factor: if developers have been developers all their lives, if they came out of a technical field, then it is more a matter of ignorance, having never been a user; they just do not appreciate walking in somebody else's shoes. Some of the cross-training that goes on might help in some of those ways. What is your opinion on that?

Bret: Oh, I agree with that. I have had some excellent testers who had been developers, who have worked on projects and really seen the value of good testing. I know of one case of a developer who made, I think, something like a million-dollar mistake, and saw what happened. He is actually a really great tester now. He knows the value of it. So, I think there is a big value there. I am always eager to get developers who have an interest in testing into a testing group. They are often very, very effective. I really try to get a mix of people in a testing group. I like to have people who are domain experts and have a very deep understanding of the customers' attitudes and requirements, as well as people with technical depth who can help build the test technology we need in order to test effectively and efficiently, and who can also identify the areas to focus on and review the stuff that we are testing.

Carol: It is that diversity in the way different people look at things that really brings value before something breaks, as opposed to having all the problems out in the world. I think you are right that we get used to some of the software that is released because of time to market or for other reasons. There is one particular manufacturer I think of, which I will not mention: two or three times a day I know my personal computer will come up with this nice blue screen with very prominent white writing that goes across it and says "abend error" with an 0E 80-digit number behind it. And what do you do? There are no numbers to call. If you call, you will probably be routed through a number of different places. I think you're right that we get used to not having our problems fixed, or to taking the blame for something that happens.

Bret: Yes. I remember that I had an error like that recently. The system had frozen up, so I needed to reboot it. It would not do a soft reboot, so I had to power cycle the computer. Then when it came up, it said it was running ScanDisk because "you did not shut the system down properly," as if I had done something wrong. The attitudes that you are talking about are actually written right into the system. You can see them in the messages that it gives us when we use it.

Carol: I think it is even a little bit more subtle than that. I know that mine comes up and says something very similar: "Reminder, you should always shut your machine down first. You did not power down properly."

Bret: Yeah. Okay, you probably remember it better than I do. It is the same message.

Carol: Well, I probably get it more often than you, and I get more ticked off than you might, but I know exactly what you mean. Let's take a look at testers and the types of skills that testers have to have. I think that one of the interviews you did in preparation for the article was with a tester named Jonathan Bach. I love his quote: he said that "testers don't always set out to break developers' software." As Cem Kaner has said, "The software comes to us already broken. We just want to find the breaks that are already there." Is that a fairly pervasive attitude, that developers look at testers as breaking something?

Bret: Well, I think there is really a big mix in that. There are a lot of developers who really do understand the value of testing and the necessity of it. They know that we are just finding problems that need to be found. So, it is a mixture of things. Sure, there are some developers who find it annoying, and you know the kind of pressure that they are under, too, because they are often expected to sign off on schedules that they do not really believe are realistic in the first place. That often puts them in a situation where the testers, well, they just bring the bad news. No one likes hearing bad news.

Carol: Which is not a great position to be in. We will be back shortly with more of Bret Pettichord, talking about developers and testers, after these short messages...We are back. Thank you for joining us, and thank you for listening to Quality Plus E-Talk with Carol Dekkers. This week we are talking to testing expert Bret Pettichord, who is also a consultant and who has been a developer. The toll-free number that they just gave out is for you to phone in to the show. The number is (866) 277-5369. If you are thinking that 866 does not sound like a toll-free area code, it is indeed toll-free. So, it is (866) 277-5369. If you would like to give any comments to me or Bret Pettichord, or if you would like to ask any questions or share your experiences, we would love to have you phone in.

We are talking about testers and developers: two different personalities, or the same type of personality? One of the questions that I promised the listeners I would ask, Bret, is what do you think of extreme programming, where we have developers who are actually writing the tests for their own code? They have to be testers and developers in the same body at the same time.

Bret: Well, that is a good question. I have worked with one group that was using some of those philosophies in their development. I see a lot of people doing a lot of different things that they call extreme programming, so that is kind of the first side of it. First of all, I think extreme programming to some degree is talking about unit testing: testing the individual modules that the developers are building. I think the general theory has always been that the developers are the best ones to do that kind of testing, and that they should be doing it. It is sad that it is not done as much as people say it is. At some places, unit testing means whatever testing the developers choose to do, and sometimes that is not much. So, it just varies a lot. I am pleased to see that the extreme programmers see the value of unit testing and that it gives them flexibility in development as well as improved reliability.
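
(To make the unit testing idea concrete, here is a minimal sketch of the kind of module-level test being described, using Python's built-in unittest module. The discounted_total function and its expected behavior are illustrative assumptions, not anything from the show.)

    import unittest

    def discounted_total(prices, discount):
        """Apply a fractional discount to the sum of an order's prices."""
        if not 0 <= discount < 1:
            raise ValueError("discount must be a fraction between 0 and 1")
        return round(sum(prices) * (1 - discount), 2)

    class DiscountedTotalTest(unittest.TestCase):
        # Each test pins down one piece of the module's intended behavior,
        # so the developer can change the code and quickly re-check it.
        def test_no_discount(self):
            self.assertEqual(discounted_total([10.00, 5.50], 0), 15.50)

        def test_ten_percent_discount(self):
            self.assertEqual(discounted_total([100.00], 0.10), 90.00)

        def test_rejects_out_of_range_discount(self):
            self.assertRaises(ValueError, discounted_total, [10.00], 1.5)

    if __name__ == "__main__":
        unittest.main()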

Carol: Would you like to take a caller, Bret?

Bret: Sure.

Carol: Okay. Hello caller.

Caller: Hello. What a wonderful conversation. This is the reason there is that wide gulf between the ultimate user and the IT department.

Carol: That is probably true.

Caller: Hello. Can you guys hear me?

Carol: Yes we can.

Caller: I am having a very difficult time, it sounds like you are a million miles away.

Carol: Well, we are across the country from you probably.

Caller: No. No, because I have called in to this station before. There is something off with the connection. I will just put in my comment: thank you for talking about the difference between developers and testers and then adding in the user component, which is who I am. I have noticed that there is sometimes this haughty attitude from our IT department when we ask for help after they have dropped a new program on us overnight, you know, without really notifying people that there has been an adjustment or, you know, a new program or a new addendum to the program that we are running.

Carol: Right.

Caller: And there seems to be, with some of the people in IT, this attitude of "don't you know what you are doing?" We normally do, but there is this lack, I call it a haughty type of attitude. And that sounds like what is being discussed between the developer and the tester. There is kind of a lack of communication; each individual has a certain area of expertise, and communication seems to be the key that is lacking, or an understanding of each other is what's lacking. Am I correct?

Carol: Yes.

Caller: And I will hang up because I can barely hear, and I will be able to hear your answer over the radio better.

Carol: Okay.

Caller: Thank you all for your program, it is very interesting.

Carol: Thank you. I appreciate you calling in. Bret, do you want to comment on that?

Bret: Sure. I think it is true that there is, in general, a certain haughtiness, as you call it, that people who become technology experts, which includes both the developers and the IT people, tend to acquire. I think that is how things are kind of set up: there is a sort of hazing process to becoming an expert, and you kind of have to go through it to get there. Not all experts get that way, and some are able to hold on to their compassion for the user. But a lot of them do, because they get told repeatedly, "Well, you did this wrong, and you did that wrong, and you should reinstall it." I think a lot of that just comes from the hobbyist roots of the software industry, especially of the PCs and whatnot. I think it is time for us to let go of that. It has been with us a long time, and it is something that can come between the testers and the developers sometimes.

I actually wanted to make a comment about something that you said earlier: you said that your company tries to manage by facts, and that is exactly what I think testers do. The testers are the people who document the facts that the managers can then use to manage by. And that, I think, is a big part of the testers' mindset: let's not talk about how it should work; let's talk about what is actually happening, what has been observed, what it is actually doing for us. The developers' approach can be, "That should not be happening. You must be doing something wrong." We refuse to talk about what should or shouldn't be happening, and we refuse to take that as an insult; we just say: this is what I saw, this is what I put in, this is what I got out, this is what is happening for me now, today. We do a really big service to everybody by focusing on the reality and on the facts, not on the theory of how it should be working but on what really is working.

Carol: We will be back with more of Bret Pettichord and his insights after these short messages...Welcome back to Quality Plus E-Talk with Carol Dekkers. We are in the midst of show number six, where we are talking to Bret Pettichord, who is a consultant, a tester, and a developer, and who has written an insightful article called "Testers and Developers Think Differently." We have been talking to him for the last half hour about why they think differently and what some of the instances are. We had a caller who mentioned the communication gap between developers, testers, and users. I would like to give out our toll-free number one more time. It is area code (866), that is a toll-free area code, (866) 277-5369, if you are interested in participating as a caller, a commenter, or somebody who has a question for Bret. Welcome back to the show, Bret.

Bret: Thank you.

Carol: Bret joins us from Austin, Texas. We have been talking about testers...what is the difference between testers and developers? Bret is in a unique situation, having been both, to be able to comment on both perspectives. One of the things that he mentioned just before we went to break is that managing by fact and doing measurement is one of the things that testers do; testers are really providing measurement. In one of the seminars that I did, we were talking about defects, defect tracking, and the quality of the software that is built. One of my participants, who works at Symbol Technologies, said, "You know what we do? Before we release a project, before we release a piece of software, we track the defects. One thing that we found to work very, very powerfully between testers and developers is that rather than calling them defects before the product is delivered, we have renamed them saves. So, we say look at all of the saves that the testers have found and the developers have fixed before it went to market. Those that are found and detected after going to market are the real defects." She said psychologically this has really helped Symbol Technologies springboard ahead and actually get away from some of that emotional tug-of-war that happens between developers and testers.

Have you experienced anything like that, Bret, at all?

Bret: Well, that is a new term; I have not heard that one. I have heard endless debates on what they should be called and what they are called. I do not have strong opinions on that. I have one friend who likes to call them "issues." He says it is just an issue, something that is brought up. James Bach defines a bug as any concern about quality. I think that is also meant to address the same idea: that this is not that somebody did something wrong, but just an expression of how we can make things better.

Carol: And it is a discovery process. It is not like the testers put those bugs in. They are there to begin with.

Bret: They are there to begin with. You know, bugs come from all kinds of sources. I think that most people think the average bug comes in because some developer did not think straight about something and made some kind of coding error. But I find that a lot of the bugs are actually caused by communication errors between people on different teams. Everyone's part of the program works the way they thought it was supposed to work, but they had different ideas about how the different parts of the program were going to communicate with each other. I find that is actually a big source of problems. You do not want to start saying, "Well, you did it wrong," or "You did it wrong." Let's just try to find a way to make this all work together.
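
(A minimal sketch, in Python, of the kind of cross-team bug being described: each function below works exactly as its author intended, but the two authors had different ideas about the interface between them. The module roles and the cents-versus-dollars convention are illustrative assumptions, not from the show.)

    def order_total_cents(quantity, unit_price_cents):
        # Billing team's convention: totals are integers, in cents.
        return quantity * unit_price_cents

    def format_invoice_line(total_dollars):
        # Invoicing team's convention: totals are numbers, in dollars.
        return "Amount due: $%.2f" % total_dollars

    # Wired together naively, both functions behave exactly as designed,
    # yet the result is wrong: 3 items at 250 cents is $7.50, not $750.00.
    naive = format_invoice_line(order_total_cents(3, 250))
    print(naive)    # Amount due: $750.00  <- the bug lives in the seam

    # The fix is agreement about the interface, not a change to either
    # team's logic: convert at the boundary.
    correct = format_invoice_line(order_total_cents(3, 250) / 100)
    print(correct)  # Amount due: $7.50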

Carol: James Boddie wrote a book called The Information Asset a number of years ago. One of the things he said in his book that I thought was very, very powerful was that developers should look at software, at system development, as a tool: a tool to help the users do their jobs better, rather than an end product. So developers should see themselves as delivering a tool. They are delivering a hammer. They are delivering a screwdriver that will enable the users to work better. Then if the screwdriver doesn't quite work, because it is too big or too small for the screws, you go back and you fix it until it does work. That is a different way of looking at things, as opposed to: "This is an artistic creation; look at what I have done, feast your eyes on it, and be happy. Look at how great this software is that I built to meet your needs." It is a different way of looking at it.

Bret: It is.

Carol: I think that part of it - I did a use cases course, and we talked a little bit about this at the break. One of my clients, Nielsen Media Research, was very, very astute, so astute that I hold them up as a shining example. They realized that use cases are a developer's tool but also a user tool, so if you teach them only to developers, and have the developers go in and write the use cases and present them to users, the use cases are going to be in technical language. So we did separate workshops, one for users and one for developers, and then brought the two together at the end. We had QA people and testers who were part of the project team take these use cases courses. One of the most insightful things one of the testers said was, "You know what, we get to be involved a little bit. We get to watch the requirements process, and then we are told what the requirements are and we are supposed to go off and write all of our tests as the developers go into this artistic mode. And oftentimes when there are changes, which there inevitably are, we do not find out about the changes until after we have tested the wrong things or gotten unexpected results." Do you find that this happens? That testers are not part of an integrated team to start with?

Bret: Well, I think that is something many teams are struggling with: trying to find ways to bring the testers in so they know more about what is happening with the day-to-day development and where things are going. I always find that it helps to have that. Part of getting there, I think, is being able to work with the developers cooperatively. When they see testers as someone who is just going to come in and sap a lot of their energy, they say, "Well, let's hold off on that." If you can engage the developers in a positive way, so that they see the testing is helping them get where they want to go, then they do not have any reason not to have you come in early on a project.

Carol: Would you agree that it goes both ways? There are some testers who think of themselves as prima donnas, as the final inspectors who can stop a project on a dime, and that they are really the ones in power. And then you get the dysfunctional developers who think that they are the prima donnas of the world and that no tester is going to tell them how to run things. Do you see kind of a prima donna complex on both sides at times?

Bret: Well, it certainly can happen. I know that a lot of testers come in as kind of junior programmers; it is not uncommon for companies to do this. So, to some degree they need to prove themselves, and the way they prove themselves is by finding the flaws in other people's work. That is often how you prove yourself in academia, or in any sphere where you have to show that you know more than other people. That is how you gain status. So, testers can be put in that position, especially if they are just coming in and trying to prove themselves, or maybe they think they could do a better job of programming than the developers. So, they want to just kind of shoot things down. It can happen. That is one of the reasons why I like to tell testers to focus on the facts, and not to say, "Well, you should have done it this way." Then we are just adopting the same attitude that we are trying to confront. Instead, say, "Here is what happened. This works in the following way, and we think it should work differently, but it is your job to figure out how to make that happen."

Carol: Right. I think you mentioned something very important: if testers are brought in as junior programmers, then once they get promoted they go into "real development," where the real work gets done. That creates a kind of hierarchy. But if you are a senior tester and you are excellent at what you do, and you love it, it is not that you have not been promoted into development; it is that you have different skills. I think that we need the experts in both areas.

Bret: That's right. Finding someone who enjoys testing matters, and of course one of the things that makes testing difficult is that not very much of it is taught in school. When people come out of a computer science program, if they're lucky, they may have had a week's instruction on software testing. So, it is not surprising that they do not want to do something that they have never been trained to do.

Carol: Everybody kind of rejects something that is brand new, especially if they think that it is beneath them. One of the things that you said in the article, on modeling user behavior, is that good testers model user behavior, focus on what can go wrong, and focus on the severity of problems, whereas good developers model system design, not user behavior: they focus on how it can work, and focus their interest on the problem itself and the problem domain. So they are coming in with two completely different perspectives; it is no wonder that there are sometimes clashes.

Bret: You're supposed to. There is a productive conflict that is supposed to happen between testing and development, and I think the goal for everyone is not to make sure that there is no conflict, but to make sure that it remains productive, remains focused on the facts, and remains focused on how to improve things, like your comment about finding saves rather than just finding blame.

Carol: Right. And testers really need to be a user advocate, I guess, and a developers' advocate. I see them as somewhat akin to a business analyst, in that they are really straddling two domains. They need to know enough about the development environment, and they also need to know a lot about what happens in the user community. One of the things that you mentioned was that testers need to be able to model user behavior, such as mistyping, typing things in out of sequence, or whatever actually happens in the real world.

Bret: Well, that is a big area. Most software is designed to work well when it is used the way it is supposed to be used. As you know, whenever we try new things, we make mistakes. So, I always pay a lot of attention to this in my testing; in fact, whenever I make a mistake doing something, I ask, well, is this a mistake that a user might make? Because if so, I am going to want the program to be accommodating. I am not going to want it to destroy all of my data; I am not going to want it to give me cryptic error messages. I am going to want it to somehow be polite to me, and help me get back on track as we move along. I think that is something testers can help a lot with, because usually those things are not specified very much. It is often when the testing comes in that we really get to focus on what the program's behavior is going to be for handling errors.
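
(A minimal sketch, in Python, of testing with user mistakes in mind as just described: when a slip is one a real user might make, the program should respond politely and keep the user's data safe. The parse_quantity function and its messages are illustrative assumptions, not from the show.)

    def parse_quantity(text):
        """Accept slightly messy input; fail with a helpful message."""
        cleaned = text.strip().replace(",", "")  # tolerate " 1,000 "
        try:
            value = int(cleaned)
        except ValueError:
            # Polite and specific, not a cryptic code.
            raise ValueError(
                "Sorry, %r doesn't look like a whole number. "
                "Please enter digits only, like 250." % text
            )
        if value < 0:
            raise ValueError("Quantity can't be negative; please re-enter.")
        return value

    # The tester's question for each case: is this a mistake a user
    # might actually make?
    assert parse_quantity(" 1,000 ") == 1000
    for bad in ["ten", "", "-5"]:
        try:
            parse_quantity(bad)
        except ValueError as err:
            print(err)  # each message should make sense to a non-expert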

Carol: Right. We will be back shortly with more of Bret Pettichord, talking about developers and testers, after these short messages...I am back for our final segment of this show. We have Bret Pettichord, who has been a developer, a tester, a test manager, and a consultant. We have been talking about testers and developers. We have a caller on the line, Alex. You had a comment that you would like to share with us.

Caller: Yes. I would like to comment on the relationship between the developer and the tester. I believe that developers and testers are on a kind of equal basis, but the big difference is that they are looking at the same product or module from a different perspective. The tester will try to see the whole thing with more breadth and less depth, and exactly the opposite is the role of the developer. The developer will concentrate on going deeper into his own module, and he is not going to care so much about the breadth of the scope. This makes these two entities, if I may say, very complementary to each other. But the main key point here is that both of them, and the whole organization, have to believe that they are on a completely equal basis, and that there is no such thing as the developers being better or worse than the testers, or the other way around.

Carol: Okay. Thank you for -

Caller: I believe, from my experience, that most of the organizations here pay more attention to the developers; they do not pay attention to the testers at all. This is where the problem starts.

Carol: That very well may be true. It may come through in salary as well, that developers may get paid more than testers.

Caller: Right. I personally believe that a software person, what I would call a developer-tester, has to spend his or her career going back and forth between the two sides of the coin. Because, in order to be a good tester, you have to know how to code pretty well.

Bret: I think that is a good point. I have worked with a lot of teams where, when a product is in development, the testing staff tends to be the most expert at what the product actually does and how it works. Many of the developers, like you said, have to go in deep. The system is decomposed and each developer is given a different module to work on, and they know how their module works really well. They have some interfaces to the other pieces, but they do not really understand the whole flow through the system. The testers are really the people who understand that the most. They have tried it many times, and you can really see projects where they are the ones doing demos, or showing people how it works, and whatnot. Sometimes they are even training the new developers, because they are the ones who really understand how the whole system is put together.

Caller: Right. May I add something here?

Carol: Sure.

Caller: I believe that the testers have to be involved with the whole process from the very beginning. They have to participate in all kinds of design reviews, because they can contribute their own point of view, which will prevent a lot of bugs, if I may say.

Bret: I think that there is a lot that testers can add when they come in early on a project. I have had testers who have found problems in different design documents up front which often, in my opinion, have prevented problems from showing up in the code later.

Carol: Excellent. I think that is excellent insight, in terms of equal footing and saying that testers need to be involved up front. Maybe we can avoid some of the problems at the end of the lifecycle that we see right now, which is kind of an antagonistic attitude towards each other. So, Alex, I would like to thank you for calling in. We have a second caller. You are on the air.

Caller: Hi Carol, my name is Greg.

Carol: Hi Greg.

Caller: I have a quick question for you and Bret. I do Web development, so I kind of come at things from the developer angle. I was just wondering who you think would be the logical choice to be, like, a Web usability expert, particularly in light of the short development cycles for the Web?

Carol: Good question. Bret, I will let you jump in there first.

Bret: Okay. I am not sure that I got the whole question. He was asking who would be a good usability expert.

Caller: Yes, particularly, given that the Web's development cycle is so condensed.

Bret: Well, I have done usability testing myself, and in the kinds of testing that I have done, we try to get people in who are similar to the user community, and we observe how they use the product while it is in development. Now, that can be hard to do, because you cannot really do that kind of testing until you already have something that is ready to go, and by then it may be difficult to change. Some people like to use the paper prototyping method, which I have not had a chance to use, but I would like to sometime. You just lay out the screens, and you could do it on paper or, say, fake it in straight HTML. You say, here is what you are going to see; what would you do when you see this screen? What would you click on? What would you think? And you walk people through the design that way. I have seen good results from that kind of technique for developing an understanding of how users will see the product.

Carol: I will add my two cents' worth when we get back from these short messages, and we will summarize this week's show of Quality Plus E-Talk with Carol Dekkers. We will be back shortly...We have been talking this week to Bret Pettichord, who is part of Pettichord Consulting. Bret, would you like to give out your Web site address for anyone who is interested in going out and seeing it?

Bret: Sure. It is at pettichord.com, and you can find my Software Testing Hotlist there as well as some other stuff about what I do.

Carol: We have a link on our Web site that goes directly from our radio show listing to your Web site; go under Bret Pettichord for show number six. The caller that we just had, Greg, asked what would be a way of testing usability, and who could do the testing. One of the things that we found with our own corporate Web site is that sometimes we are not sure exactly who is shopping or who is buying, because the Internet and Web development are absolutely open to anybody. Anybody who previously may or may not have phoned you now really has access 24 hours a day, 7 days a week, and can come in and look at your site, and browse, and look for things. I know that one of the things Software Quality Engineering did with the StickyMinds.com Web site is that they actually put out a prototype and asked their users, what would you like to see? I think that sometimes we in development, we who are in technical fields, think that it is cheating to ask users what they want and then give it to them. That is just too easy. So, I think sometimes we just have to bite the bullet, ask what people want, and then give it to them. Do you agree, Bret? What do you think?

Bret: Oh yes. I think there are a lot of ways. I think that developers in general like being told what to do. They are problem solvers. Give me a problem and I will figure out a way to make it happen. So, they want to get that kind of information.

Carol: I think that we have complementary skills. I think one of the things that we have really learned on this show is that developers are specialists: they have to be technically competent and go to the depths of programs and system development. Testers need to be generalists: they need to have a wide appreciation of the interaction between systems, of how users may behave, and of the things users may do that, from a developer's point of view, the system was not supposed to do, but that it does. I would like to thank Bret Pettichord for giving us his hour of insight and spending the time with us to talk about testers and developers. So, thank you for being on the show, Bret.

Bret: Thank you. I have enjoyed it.

Carol: It has been a pleasure. I keep having better and better guests. One of the comments that came in was from a person who said, you just keep getting the cream of the crop as guests. I can absolutely agree with that. I have just been very fortunate, and I appreciate Bret spending his time. Next week we are hoping to have Elisabeth Hendrickson on our show, who is going to be talking about evaluating tools: What are the types of things that you need to do? What are the types of questions that you need to ask about any software tool if you are going to evaluate and compare them? So, we are hoping to have her next week. The week after that, we have one of the gurus of software development: Tom DeMarco will be here from the Atlantic Systems Guild. Interestingly enough, he is going to be talking about risk management and making systems work. I am also going to ask him a little bit about his brand new, completely fiction book. He has written a brand new mainstream New York Times bestseller called, it is something Hill. It is about a boy in Maine and growing up. It is a complete departure for Tom DeMarco. We will have an interesting show then, talking about risk management and system development, and I will slide in a few questions about his new fiction career. After that we are going to have David Zubrow, who is a senior member of the technical staff at the Software Engineering Institute. He will be freshly back from the Software Engineering Process Group Conference in India, and we will be talking to him about CMM (capability maturity model) integration advancement and what it means around the world. We are also going to have Jim Highsmith, who is one of the people behind adaptive software development. Then we are going to have Stan Magee and Peter Voldner, who are experts on ISO international standards. They are going to talk to us about how you choose a process standard that will really work for your environment. We have an exciting lineup of shows. Keep an eye on StickyMinds.com and qualityplustech.com for more information about upcoming shows. Bret, what would you like to say just in closing to our listeners?

Bret: We have been talking about testers and developers working together, and I think a big part of that is appreciation. Just to show that I do appreciate the developers that I work with, I wanted to point out one trait that they do have: a great deal of optimism. They see different ways of doing things, and they want to make them happen. I think that by appreciating the skills that everyone brings to the group, we all get a better product as a result.

Carol: Thank you very much. I appreciate all of you, whether you are developers, testers, or interested third parties. Thank you for listening, and we will talk to you again next week for more of Quality Plus E-Talk. We will E-talk to you in a week, and in the weeks after that. Thank you.

Copyright 2001 Quality Plus Technologies and Carol Dekkers. All rights reserved.
