
Using Mixed Methods in Research

Using either quantitative or qualitative research methods alone is insufficient for addressing many research questions. This article explains the merits of combining both methods.

Most of us are all too familiar with the terms quantitative and qualitative research methods. The quantitative research method, as the term connotes, focuses on the analysis of discrete variables, that is, distinct attributes that can be measured on one of the four scales of measurement, namely nominal, ordinal, interval, and ratio. The quantitative approach is commonly used in the hard sciences, such as the biological and physical sciences, where the subjects being studied can be observed in great quantities. This research method, however, cannot be used as effectively in situations that require an in-depth understanding of the subject being scrutinized, particularly humans, who are complex beings whose behavior cannot simply be explained by numbers or quantification. In such cases the quantitative method provides only a superficial answer to the issues at hand. The qualitative method, thus, comes into play.


Purpose and Scope of the Study

The above explanation boils down to this realization: what really matters in doing research is that you, as the researcher, are very clear about your purpose. What are the objectives of your investigation? What, really, is the purpose of your study?

The purpose or objectives of the research define the direction of the literature review, the research design, and data collection. You cannot afford to study all aspects of a phenomenon, so it makes sense to limit your scope to those items that you can adequately answer with the resources you have at hand.

Two major questions arise that will help you shape your objectives:

  1. Are you competent enough to pursue the topic?
  2. Do you have the time, money, and willingness to work towards the completion of your research?

Why Mixed Methods?

For every phenomenon, the 5Ws and 1H of reporting, that is, the What, Why, When, Who, Where, and How, provide the complete picture needed for a thorough understanding of a point of interest. The quantitative method answers the What, When, Where, and How Many in the form of numbers, but it does not answer the Whys and Hows of the phenomenon; hence the qualitative method comes in. Combining these two methods gives rise to the increasingly popular research approach termed Mixed Methods, which captures both the breadth and the depth of information. Mixed Methods is more encompassing, so a more comprehensive answer to the research questions is arrived at.

Quantitative and qualitative research methods complement each other. Thus, the use of Mixed Methods is a recommended approach, especially when dealing with human behavior.

© 2013 December 17 P. A. Regoniel

How to Use Research Tools in Managing Your Finances

How do you manage your finances? Do you realize that you can use research tools to manage them effectively? Read the article below and be victorious in handling your monetary affairs.

Some people find it difficult to keep themselves free of financial concerns. Many of these difficulties lie in the kind of lifestyle that people adopt in their daily lives. But what kind of lifestyle do you have, and how does it impact your financial status?

There is no generic way to describe what really transpires in one's financial life unless an objective approach is applied to examine it. It may be that you believe you are doing just fine, and that it is just the circumstances around you that make it difficult for you to make ends meet.

Most of us have this inclination: we tend to believe what we perceive. But we could be wrong in this presumption. For this reason, doing a little research and using research tools can empower you to manage your finances better.

How can research tools be used in handling your money? Here are the things you can do to see things clearly and apply the correct approach to bring you out of perennial debt.

1. Use graphs to look at how you spend your money

You cannot really say you fully understand your spending behavior unless you list down where your money goes. Every time you buy something, take time to record the transaction in a notebook dedicated to that purpose. Don't rely on electronic media alone, as you may lose the records if viruses attack your computer, and your expenses will not be as visible or accessible to you. With a hard copy, you can grasp the whole picture in an instant.

Do your recording consistently for one month and graph your expenses at the end of the month. This is easily done using a spreadsheet application such as Excel or OpenOffice Calc. The best graph for this purpose is a bar graph: the items you spend most on will rise above the rest of your expenses. Look at the three highest bars and consider whether you can still bring them down or make a little adjustment.
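
If you prefer a script to a spreadsheet, here is a minimal sketch of the same idea in Python. The expense categories and amounts are hypothetical placeholders; only the sorting and plotting logic matter.

```python
import matplotlib.pyplot as plt

# Hypothetical monthly totals summarized from a hand-written expense notebook.
expenses = {
    "Rent": 8000,
    "Food": 5200,
    "Transport": 2100,
    "Utilities": 1800,
    "Mobile load": 900,
    "Leisure": 1500,
}

# Sort from largest to smallest so the three tallest bars are easy to spot.
items = sorted(expenses.items(), key=lambda kv: kv[1], reverse=True)
labels = [name for name, _ in items]
amounts = [amount for _, amount in items]

plt.bar(labels, amounts)
plt.ylabel("Amount spent this month")
plt.title("Where the money went")
plt.tight_layout()
plt.show()
```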

2. Get a good, objective perspective to guide your thinking

Many people believe that they need everything they spend on. Wants may actually be viewed as needs by different persons. For example, having the latest high-tech cellphone satisfies the need to interact with the latest technology. That could be considered a need if it satisfies the person and he has the means to acquire it. But for a person who does not have the means, it is just a luxury. So the classification of needs and wants ultimately lies in the eyes of the person concerned.

Adopt a perspective that suits your present financial capacity. Don’t fall into the trap of keeping up with the Joneses because the Joneses live luxuriously. You do not know the real score behind their way of life. It may be that they are in a quagmire of debt.

3. Use a theoretical framework to prioritize your needs and wants

What needs do you really have to fulfill? You can reflect on Maslow's Hierarchy of Needs to get a better idea of where your priorities stand. The basic needs at the very bottom show that people need to breathe fresh, clean air, drink clean water, wear appropriate clothing to warm the body or protect it from the elements, and get good sleep. These needs must be satisfied first before going up to the next level. The items towards the top describe the higher-level needs that arise when all the needs below are satisfied.

Why use this theoretical framework by Maslow? A theoretical framework sums up, in a critical way, the results of many studies. For this reason, the framework is a sound guide to what you really need to fulfill.

4. Satisfy your needs before your wants, but first define what is a need and what is a want

In research, you need to be clear with what terms you are using. Is your definition of need clearly distinguishable from want? What is really a need? How about the want?

The rule is that you satisfy your needs first before your wants. This is only fair because needs must be fulfilled mainly for survival, while wants can be done without; wants can be ignored. This is why many advertisements appeal to one's vanity to sell products that are not actually vital to one's survival.

Needs and wants, therefore, can be defined thus:

  • Needs. These are goods (and/or services) that are vital to one's very survival. Non-satisfaction of these needs leads to loss of life.
  • Wants. These are goods (and/or services) that one can avail of for enjoyment but can live without.

Notice that ads are trying to make you believe that the things they sell are really your needs. But if you ask yourself if you really need those things to survive, chances are, the answer is “not at all.”

5. Systems Thinking: Don’t spend more than you can earn and spend the minimum possible

Adopt systems thinking in managing your finances. What comes in must balance what goes out; don't put out more than what comes in.

Imagine yourself stranded on an island surrounded by sharks, where your only means of survival are the goods that drift your way from a sunken ship. Your consumption will depend entirely on what goods land on the island. Your best strategy for survival is to live on the smallest amount that keeps you alive each day, for as long as possible.


A credit card is one way you can end up spending more than you actually earn. Spend less than what you earn; if you want to spend more, earn more. This is just a matter of discipline, and if you don't have discipline, don't use a credit card. Always pay your credit on time to avoid being buried in debt. Stay within your credit limit, and don't increase that limit if you are not able to pay your debts on time.

Lastly, what matters most is application. Whatever you learn but are unable to apply will be for naught. Take action and manage your finances well.

© 2013 December 7 P. A. Regoniel

How are Crocodilian Populations Studied?

Crocodiles are an important component of the aquatic ecosystem. To keep track of and manage their populations in the wild, spotlighting surveys are conducted. This article details how this is done.

Recalling the days when I surveyed crocodilian populations as part of my job, I thought of writing this article for those who are interested in how the populations of these magnificent animals are studied. There is a rush of adrenaline every time I do the spotlighting survey with my team, especially in areas where large crocodiles have been sighted. We do this at night along the river, from 6 pm to 12 midnight, amidst thick stands of mangroves, for a greater chance of seeing these nocturnal animals.

Materials for Crocodilian Survey

What materials are required to survey crocodilian populations? Here is a list.

  1. A fiberglass boat with an outboard motor
  2. A bright spotlight with a 12-volt car battery
  3. Dark clothing
  4. A 3-meter pole about 1.5 inches in diameter

Understanding the Risks of Dangerous Animal Surveys

Is it safe to survey dangerous animals like the saltwater crocodile (Crocodylus porosus) in the wild? There are risks, of course, but armed with an understanding of the behavior of crocodiles, there is no need to be too apprehensive. As long as you keep your presence of mind, you can ward off possible attacks.

Crocodiles are generally shy and will not attack a boat whose bright spotlight is the only source of light at night, unless it is a very big male defending its territory from intruders. In fact, the use of the spotlight by crocodile hunters almost decimated wild crocodilian populations in many parts of the world. The animals' tell-tale red eye reflection gives away their presence, whether they are floating along the river or basking on the riverbanks.

The author with croc survey buddies back in the early 90s.

Why is the spotlight an effective tool in surveying crocodilian populations? It essentially blinds crocodiles when the light is focused directly on their eyes; they cannot see who is behind the spotlight and become confused. This is why the spotter needs to wear dark clothing, so that there is no visible target at which the crocodile can aim its hard-clamping jaws, which bite with a force equivalent to about two tons.

What is the 3-meter pole for? It is just a precaution in case a crocodile attacks for some reason. You can use it to jab the crocodile's body and push it away; crocodiles tire easily with sustained effort.

One reason for an attack is when someone approaches a nest, a mound of leaves and mud that contains the crocodile's eggs. Maternal instinct dictates that the female crocodile protect its young. Crocodile spotters should avoid these areas, which are usually found in the freshwater portion upstream.

Spotlighting Survey

How does a spotlighting survey work? It is simple. The spotter, holding the spotlight, stays at the front of the boat, scans the water ahead, and gives directions to the boatman. With the observer's eyes at spotlight level, it is easy to spot the crocodiles because their eyes reflect a gleaming red. The number of crocodiles can then be counted. To get the population density, divide the number of sightings by the number of kilometers covered by the survey.
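
As a minimal sketch of that last step, the calculation looks like this; the counts and transect length below are hypothetical, not actual survey figures.

```python
sightings = 14      # eyeshines counted along the river transect
distance_km = 8.5   # kilometers of river covered by the survey

density = sightings / distance_km
print(f"Encounter rate: {density:.2f} crocodiles per kilometer")
```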

A 17.5-foot man-eating crocodile culled out of its habitat.

Comparing crocodile densities between or among rivers gives you useful information for management purposes. Management means removing "problem crocodiles," like the one in the photo above, and maintaining the population at levels that do not threaten the human population. This management approach applies only to man-eating crocodiles; there are many other kinds of crocodiles in the world.

Why Should the Crocodile Population be Conserved?

Crocodiles are an essential part of the aquatic ecosystem. They are known to enrich the waters through their excretions and secretions. Their body wastes enhance algal growth that serves as abundant food for herbivorous fishes, thereby powering up the river food web.

Scientists once discovered that overhunting of crocodiles led to a significant reduction in fish populations in the nutrient-poor rivers of the Amazon. A study in Africa likewise showed that when crocodiles were hunted aggressively for their hide, the hippopotamus population surged and upset the balance of the aquatic and nearby grassland ecosystems. Thus, these highly resilient reptiles, reminiscent of the dinosaur age some 200 million years ago, should be conserved.

© 2013 October 29 P. A. Regoniel

Five Memory Improvement Tips for Researchers

Field encounters become more meaningful when you gather and recall as much detail as you can. If you do not have a system or technique to do it, it will be difficult. Here are five memory improvement tips that work.

Good researchers need a sharp memory; without one, they will lose valuable information from field encounters. The ability to recall things is not a matter of genes alone. It can be developed with practice and through the use of creative means.

If you have trouble recalling things, then the following memory improvement tips will help you remember those things observed in the field.

Five Memory Improvement Tips

1. Bring your notes and pencil/pen with you.

Notes are indispensable tools to help you recall things. Jotting a few words or phrases in a small notebook will remind you of important statistics such as the number of people who benefited from a project, the number of animals your interviewee spotted, or the frequency of sightings. When your audio or video recorder fails, you always have your notes to fall back on.

2. Always tote the indispensable camera.

For the modern researcher, the camera tells a thousand words. You can describe the study site with just a quick shot of the landscape, or document a new species that cannot be easily trapped. Setting the camera to video mode will help you capture people's conversations and reactions, making those available during post-fieldwork analysis. Just make sure the memory card is inserted in the camera and the battery is full.

If your camera is not weatherproof or waterproof, bring along thick, transparent plastic bags to wrap your camera in when unexpected rain comes. You will need a rubber band to seal the bag and keep the camera dry.

3. Always bring a young assistant with you.

Why do you need a young assistant with you? Well, that's simple. Younger people tend to recall things better than aging researchers who have spent most of their time studying and have narrowed their frame of mind as a result of specialization. If you are a professor, choose a student who performs well in class.

4. Consciously mark things with objects and arrange them in pairs

It is easy to recall things when you associate them with something. For example, you can represent a series of numbers with the following objects: 1 – a bottle, 2 – a twin, 3 – a tree, 4 – a paper clip, 5 – a hand, 6 – a man with six-pack abs, 7 – a sickle, 8 – a twisted tire, 9 – a cat, and 0 – an egg.

How do you use these numbers to represent a series of numbers say a telephone number? Easy. If you want to recall 432-7812, associate this number in pairs of objects representing the numbers in the previous paragraph. This number converts into the following symbol combinations: paper clipped on a tree, tree hugged by a twin, one of the twins holding a sickle, sickle cutting a twisted tire, tire containing a bottle, and bottle with two small twins inside.
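
A small sketch of that digit-to-object substitution, for readers who want to play with it programmatically. The object list mirrors the article's examples; the phone number is the article's hypothetical one.

```python
digit_objects = {
    "1": "bottle", "2": "twin", "3": "tree", "4": "paper clip", "5": "hand",
    "6": "man with six-pack abs", "7": "sickle", "8": "twisted tire",
    "9": "cat", "0": "egg",
}

def to_objects(number):
    """Convert each digit of a number into its memory object."""
    return [digit_objects[d] for d in number if d.isdigit()]

print(to_objects("432-7812"))
# ['paper clip', 'tree', 'twin', 'sickle', 'twisted tire', 'bottle', 'twin']
```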

Now, picture those objects and their relationships in your mind. Whether you recall them forwards or backwards, it is easy to get back to the numbers. Without such associations, you lose a handy extension of your memory.

5. Associate events that you want to recall with a traumatic incident.

Traumatic incidents in your life are easily recalled. Associate an important event in your investigation with a traumatic incident in your life. Although recalling that unpleasant experience can make you feel bad, it can help you remember important things. In so doing, you may also gradually defuse that horrible, nasty, or obnoxious experience of yours.

Try these memory tips and see a difference in your research output.

© 2013 September 19 P. A. Regoniel

Examples for Research Design Development

How do you come up with your research design? Here are two examples of blood pressure exploratory studies as leads toward research design development.

Blood Pressure Exploratory Study

I find the practical aspects of applying research enjoyable, and I have designed experiments to uncover relationships or to resolve problems of my own.

Several years ago, I convinced my doctor to cut my blood pressure maintenance drug. I simply presented to him a graph I had prepared using a spreadsheet application, along with an Analysis of Variance (ANOVA) comparing my blood pressure readings at the full dosage of the drug, half of it, and a fourth of it. I also compared the groups two at a time using t-tests and got the same results. The graph and the statistical analysis showed that my blood pressure readings did not differ significantly as I gradually reduced the dosage of the prescribed drug.
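
A minimal sketch of that kind of comparison, assuming the readings are kept in simple lists; the systolic values below are hypothetical stand-ins, not my actual data.

```python
from scipy import stats

# Hypothetical systolic readings (mm Hg) under each dosage level.
full_dose    = [118, 122, 119, 121, 117, 120]
half_dose    = [120, 119, 123, 118, 121, 122]
quarter_dose = [121, 120, 118, 122, 119, 123]

# One-way ANOVA across the three dosage levels.
f_stat, p_anova = stats.f_oneway(full_dose, half_dose, quarter_dose)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_anova:.3f}")

# Pairwise comparison of two groups at a time, as described above.
t_stat, p_ttest = stats.ttest_ind(full_dose, quarter_dose)
print(f"Full vs. quarter dose: t = {t_stat:.2f}, p = {p_ttest:.3f}")

# p-values above 0.05 would suggest no significant difference between dosages.
```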

The Diet Experiment

The primary purpose of the above experiment was to see whether diet can produce the same results as a drug in lowering blood pressure. One of the active components of the drug in question was potassium, and I thought it better to get the mineral from natural food. I computed the number of potatoes that would supply an amount of potassium corresponding to the dosage, then gradually reduced the drug dosage while monitoring my blood pressure daily.

From my reading on the nutritional value of the potato, it would take about two to three medium-sized potatoes to get the same amount of potassium. So I ate at least three potatoes a day to supply the required potassium as I reduced the drug dosage.
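
The arithmetic behind that substitution is simple division, sketched below. Both the target potassium figure and the per-potato content are assumed values for illustration only, not nutritional or medical guidance.

```python
target_potassium_mg = 1800       # assumed daily potassium to replace (illustrative)
potassium_per_potato_mg = 600    # assumed content of one medium potato (illustrative)

potatoes_needed = target_potassium_mg / potassium_per_potato_mg
print(f"Roughly {potatoes_needed:.0f} medium potatoes per day")  # about 3
```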

When I got down to a fourth of the prescribed dose, I asked the doctor if I could forgo the drug, because I suspected it was one of the reasons I was feeling weak, a suspicion I confirmed by reading up on the drug's side effects. The doctor was amazed because I had reduced the drug to an amount that, according to him, no longer provides substantial benefit in lowering blood pressure, and yet my blood pressure had stabilized. He said I could forgo the drug and return to him when my blood pressure rises again.

Despite this apparent success in my desire to live without the drug in my system, I do not recommend this approach to anybody because it might work differently for different people. I take certain precautions when I conduct studies on myself.

Blood Pressure and Exercise Experiment

Recently, I was at it again. This time, I wanted to verify whether exercise indeed provides the benefit of lowering blood pressure. My readings say so, and I wanted to find out personally what the numbers would show. I monitored my blood pressure before exercise, right after exercise, and 15 minutes or more after exercise so that my blood pressure could stabilize at rest.

I just started this last week and saw a trend just from three readings. I show the results of the blood pressure monitoring in the table below.

blood pressure data

The results are interesting because my blood pressure obviously went down right after exercise and dropped further 15 minutes after. Upon waking up, my systolic reading, which indicates the maximum arterial pressure, was higher than the normal 120 mm Hg, but the diastolic reading was normal. After exercise, the systolic pressure dropped greatly, evident from visual inspection alone, even while the heartbeat was high. After 15 minutes, the systolic and diastolic readings went down even further while my heartbeat approached its normal value.

So are the results conclusive enough that exercise lowers blood pressure? There is no doubt that exercise lowers blood pressure[1], but I have not seen details on how much blood pressure is reduced by exercise. This data shows me right away the benefits of exercise and serves as encouragement to engage in and maintain my exercise routine.

Today, when I went for my usual six-kilometer run in 41 minutes and 38 seconds (my fastest so far at that distance), my blood pressure after exercise approximated the previous values: 104/65 with a heartbeat of 95. Fifteen minutes later, my heartbeat had stabilized at 64, with a corresponding blood pressure of 101/59.

Data Collection Procedure for the Exercise Experiment

How did I come up with such values? What is the data collection procedure? I collected the data systematically, making the readings as consistent as I could. Roughly, the data collection procedure goes as follows and can be replicated by anybody.

Record BP and heartbeat –> stretching exercise for five minutes –> slow walk of 8 minutes –> run proper –> cooling down with a slow walk for about 15 minutes –> five-minute stretching –> record BP and heartbeat –> rest for 15 minutes –> record BP and heartbeat
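
One way to keep such records consistent from day to day is to log each session in the same format. Below is a minimal sketch; the file name, field layout, and the pre-exercise and sleep figures are assumptions, while the post-exercise numbers echo the values reported above.

```python
import csv
from datetime import date

def record_session(path, before, after, rest_15min, sleep_hours):
    """Append one session's readings; each reading is (systolic, diastolic, heart rate)."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow(
            [date.today().isoformat(), *before, *after, *rest_15min, sleep_hours]
        )

record_session("bp_log.csv",
               before=(128, 82, 72),       # hypothetical pre-exercise reading
               after=(104, 65, 95),        # right after exercise
               rest_15min=(101, 59, 64),   # after 15 minutes of rest
               sleep_hours=7)              # hours of sleep the night before
```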

I used an Omron automatic wrist blood pressure monitor to take the readings. It reads a little higher than a standard sphygmomanometer, but it reads consistently, so it would be easy to calibrate for more standardized results.

From this exploratory study, which confirms the benefits of exercise in a quantitative way, a research design may be developed for more rigorous analysis. Note, however, that sleep may also influence blood pressure readings, so I marked the third reading with an asterisk. The quality of my sleep before the first two readings was not that good, as I had six hours of sleep or less, while before the third reading I got quality sleep of seven hours or more. This apparently resulted in lower blood pressure readings upon waking up.

This means that if I pursue this experiment, I should keep my measurements consistent and factor the hours of sleep into the analysis. I should also make sure that the monitoring time is the same throughout the duration of the study.

Now the question is: "Have studies like this been conducted before?" I actually don't know, as in truth I am not a medical researcher. At best, my experience is only a case study, a description of my own case. But a review of the literature will tell me whether a similar experiment has been done by anyone on a greater number of people. Such experiments are guided by the different theories on the effects of exercise on health developed through time. In my case, I did it out of mere curiosity, to verify my readings, which are also backed by theory.

From these initial data, an experimental research design may be developed to ensure that the evidence obtained answers the questions initially posed for the study. Two questions were posed in these two examples: 1) Can a well-planned diet produce the same results as a drug in lowering blood pressure?, and 2) Does exercise lower blood pressure?

From simple case studies like these, experiments may be designed to test if the findings are consistent for a greater number of people. This will also provide insights on which variables should be included for analysis.

Reference

1. Mayo Clinic (n.d.). Exercise: A drug-free approach to lowering high blood pressure. Retrieved August 27, 2013, from http://www.mayoclinic.com/health/high-blood-pressure/HI00024

© 2013 August 26 P. A. Regoniel

How to Conduct a Focus Group Discussion

How do you extract useful information from a group of people in connection with your research? One of the tools used is focus group discussion. Read on to find out how this is done.

If you engage in social research or study research methodologies, one of the common (sometimes abused) methods of data collection that you should be familiar with is focus group discussion or FGD. Aside from soliciting ideas that will help answer or narrow down your research topic, the output of the discussion verifies or confirms the results of surveys designed to answer research questions that you are interested in (see Triangulation).

What is focus group discussion, when do you use it and how should you conduct it? What good practices should be observed? This article provides answers to these questions.

Definition of Focus Group Discussion

Sometimes, FGD is also called focused group discussion because the discussion focuses on questions that seek a multi-stakeholder response. It may also refer to the 'focus group,' that is, those who are deemed relevant to take part in tackling the issues raised by the researcher.

Essentially, an FGD is a discussion of issues and concerns among a selected group of four to eight people. It serves as a venue to confirm and verify the participants' viewpoints and draw out their experiences so that they can build a consensus about the research topic. A well-trained moderator guides the progress of the discussion using a set of questions prepared by the researcher.

How are the Participants of FGD Selected?

The participants of the FGD are selected using a set of guidelines or criteria, such that they can give useful and relevant information to meet the objectives of the study. This requires familiarity with the background of the participants. It is therefore common practice to consult managers, leaders of the community, those who have lived in the community for a while, or someone familiar with the business of an organization before conducting an FGD.

For example, in a study on coastal resource use, if the issue relates to dynamite fishing in the coral reefs, the participants to look for include representatives from the groups of fishers, fish traders, former dynamite fishers, law enforcers, explosives suppliers, local policy makers, non-government organizations or associations, among others who have direct or indirect transactions in the community. Avoid bias in the selection of participants such as including only those who are accessible or favoring a certain political group.

Example FGD Questions

Examples of questions that relate to illegal fishing in the coral reefs that will serve as the focus of the FGD are the following:

  • What are the target fish species of the illegal fishers?
  • How do the illegal fishers get their explosives?
  • How much do the dynamite fishers earn from their activity?
  • Why do the dynamite fishers engage in this illegal activity despite prohibition?
  • What are the risks associated with dynamite fishing?
  • At what time of the day and how frequently do the dynamite fishers go out to fish?
  • Where are the dynamite fishers coming from?

Of course, the questions will ultimately depend on what information you would like to draw out from the participants. The FGD enables you to explore which variables to include and where to focus the quantitative part of your study, if there is one.

You might want to relate fisher income with frequency of dynamite fishing. Or you might want to quantify the costs and benefits of dynamite fishing (taking the point of view of the fisher). The end justifies the means, so they say.

How to Conduct the FGD

The following resources are needed to conduct a focus group discussion.

Human resources

  1. A trained moderator or facilitator. The moderator need not be the researcher himself but should be someone familiar with the issues to be discussed. Hence, he should confer with the researcher before conducting the FGD. He should have good background knowledge of the participants and must not involve himself in the discussion, such as by arguing with the participants. His main role is to introduce and explain the questions, clarify issues raised, confirm responses, and encourage expression of ideas, among other related functions. He summarizes the process at the end of the discussion.
  2. A note taker. The note taker records the progress of the FGD. He lists not only the participants' orally expressed ideas but also their actions and non-verbal expressions. He clarifies points once in a while by calling the moderator's attention to points that are not clear. He furnishes a copy of the transcripts to the participants as a matter of transparency.

The quality of information gathered through the FGD depends to a large extent on the skill and keenness of the moderator and the note taker. For best results, rapport should be established between the researcher's group and the participants so that the participants do not inhibit themselves from freely expressing their ideas.

Materials

The following materials should be made available during the conduct of the FGD:

  1. Recording material. The standard notepad and pencil or pen must always be available. Although laptops, tablets, cameras, MPEG recorders, or anything electronic will work in an urban environment, a different situation exists for FGDs conducted in far-flung areas. These gadgets may be used to record data in the field, but they are prone to many problems such as low batteries, breakage during the trip, or getting submerged and damaged while wading across a river, among others. If electronic recording equipment is really desired, it should be weather- and/or shock-resistant.
  2. Group memory. Group memory is something the participants can refer to as the discussion takes place; they focus their attention on this running list of questions and responses. It could be a set of Manila paper sheets with pre-written questions, a whiteboard or blackboard, or even a mini-projector.
  3. Attendance sheet. If you do research for somebody (say, as a consultant) or in compliance with your thesis requirement, you need this because it serves as evidence that you really conducted the FGD. It will also help you find your respondents if you need to go back and clarify points.
  4. Global Positioning System (GPS). This will aid you in recording the location where you conducted the FGD. This is good information for those who would like to make a follow-up study in a similar place.

The information derived from the FGD, aside from fulfilling an academic requirement, is useful in policy making and management. It can lead to agreement on certain controversial issues and evaluation of program or project accomplishments in the target community.

Reference:

International Institute of Rural Reconstruction (1998). Participatory methods in community-based coastal resource management.

© 2013 August 14 P. A. Regoniel

How to Reduce Researcher Bias in Social Research

In conducting research, being partial can lead to faulty conclusions. This tendency is conveniently called bias.

How can a researcher avoid committing this blunder? This article explains what bias is and suggests ways on how to reduce it.

One of the important considerations in research involving people's responses (i.e., social research) is to reduce or eliminate researcher bias. If a researcher conducts the investigation in a biased manner, the research outcome becomes inaccurate and unreliable.

If the results of a study are unreliable, inaccurate, or invalid, questions will arise on many fronts, and doubts regarding the conclusions of the study will compromise its publication in a refereed journal. In business, decision-making based on the results and conclusions of such a study will be faulty, leading to profit loss or an inability to solve organizational or operational problems. Bias distorts the truth.

But what causes bias and how can you as a researcher avoid it?

Definition of Bias

A relevant definition of bias in the Bing dictionary states: "bias is an unfair preference for or dislike of something." In the context of research, this means that the researcher does something that favors or skews the study towards a certain direction. The researcher may commit it deliberately or inadvertently.

Causes of Bias and Suggested Solutions

While it is difficult to totally eliminate all sources of bias in the conduct of your research, being aware of the following common pitfalls in the practice of research is desirable:

1. Personal convenience in data collection

Many of those who conduct research fail to do good research because they want to do it at their own convenience. For example, instead of drawing a random sample of respondents, a researcher may simply interview anyone who gets in his way. This is not an objective way of obtaining a good sample from the population of the study (see the sketch after the suggested solution below).

Suggested Solution:
  • Ask yourself the question: “Am I doing this part of the research for my personal convenience?” If you are, then recognize that this will introduce bias and reduce research quality.
  • Select respondents randomly.
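
As a minimal sketch of the second suggestion, a simple random sample can be drawn in a few lines; the sampling frame and sample size below are hypothetical.

```python
import random

# Hypothetical sampling frame: a list of all 200 households in the study area.
sampling_frame = [f"Household {i:03d}" for i in range(1, 201)]

random.seed(42)                               # fixed seed so the draw can be documented
sample = random.sample(sampling_frame, k=30)  # 30 respondents, drawn without replacement
print(sample[:5])
```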

2. Favoring your own stand

While the nature of your research may be argumentative, favoring a preconceived position on the subject you are investigating will cause bias in your results. You will have the tendency to steer the results of your study in the direction that you want.

Suggested Solution:
  • Stick to what your data shows.
  • Do not manipulate the results.

3. Inadequately prepared questionnaires

Inadequately prepared questionnaires can lead to many biases in the results of your study. Make sure that you follow good practices in preparing questionnaires.

Suggested Solution:
  • Read reliable references on how to prepare questionnaires objectively.
  • Once you are ready with your questionnaire, validate it with a non-respondent group.
  • Make changes when appropriate.

This means that you should take enough time to prepare your questionnaire. It is better to start right than to have doubts about the reliability, validity, or accuracy of your data later on.

4. Faulty Data Collection Procedure

Watch out for unreliable answers to the questions that you posed. It is possible that the following occurred during the data collection:


  1. respondents are agitated, so they answered quickly without pondering the questions
  2. you asked leading questions, that is, questions that push respondents toward a choice because you mentioned it
  3. you answered the questionnaire for your respondent out of impatience while waiting for the response
  4. the questions are not stated clearly
  5. the respondent has no experience with the subject you are asking about
  6. other parties present at the time of the interview influenced the respondent's ideas or thoughts
  7. the questions are so long that the respondent gets tired of responding to them
  8. respondents fear that their answers may incriminate them
Suggested Solutions:
  1. make sure that the respondents are ready and willing to be interviewed
  2. don’t provide the choices to the respondents in personal interviews
  3. be patient in administering questionnaires
  4. ensure that the questions are clear enough
  5. evaluate the capability of the respondent in answering the questions
  6. conduct personal interviews in places not within the hearing distance of others
  7. prepare questionnaires that will be finished in a reasonable amount of time (some suggest 30 minutes is great)
  8. explain to your respondent that their answers will not be held against them

5. Unverified Information

Sometimes, researchers rely on just one source of information in making their conclusions. This practice is prone to bias. Triangulation may be employed to avoid this pitfall; it was discussed before in the post titled Data Accuracy, Reliability and Triangulation in Qualitative Research.

The whole point of this discussion is that the researcher should take all precautions against doing things that may negatively affect or threaten data accuracy and reliability. A researcher, therefore, must be neutral and objective in carrying out his study.

© 2013 August 12 P. A. Regoniel

What is a Model?

In the research and statistics context, what does the term model mean? This article defines what a model is, poses guide questions on how to create one, and provides simple examples to clarify points arising from those questions.

One of the interesting things that I particularly like in statistics is the prospect of being able to predict an outcome (referred to as the dependent variable) from a set of factors (referred to as the independent variables). A multiple regression equation, or a model derived from a set of interrelated variables, achieves this end.

The usefulness of a model is determined by how well it is able to predict the behavior of dependent variables from a set of independent variables. To clarify the concept, I will describe here an example of a research activity that aimed to develop a multiple regression model from both secondary and primary data sources.

What is a Model?

Before anything else, it is always good practice to define what we mean here by a model. A model, in the context of research as well as statistics, is a representation of reality using variables that somehow relate with each other. I italicize the word "somehow" here, mindful of the possibility of correlation between variables when in fact there is no logical connection between them.

A classic example given to illustrate nonsensical correlation is the high correlation between length of hair and height. It was found out in a study that if a person has short hair, that person tends to be tall and vice-versa.

Actually, the conclusion of that study is spurious because there is no real correlation between length of hair and height. It just so happens that men usually have short hair while women have long hair, and men, in general, are taller than women. The true variable that really determines height is the sex or gender of the individual, not the length of hair.

At best, the model is only an approximation of the likely outcome of things because there will always be errors involved in the course of building it. This is the reason why scientists adopt a five percent error standard in making conclusions from statistical computations. There is no such thing as absolute certainty in predicting the probability of a phenomenon.

Things Needed to Construct A Model

In developing a multiple regression model which will be fully described here, you will need to have a clear idea of the following:

  1. What is your intention or reason in constructing the model?
  2. What is the time frame and unit of your analysis?
  3. What has been done so far in line with the model that you intend to construct?
  4. What variables would you like to include in your model?
  5. How would you ensure that your model has predictive value?

These questions will guide you towards developing a model that will help you achieve your goal. I explain below the expected answers to each of these questions, with examples to further clarify the points.

Purpose in Constructing the Model

Why would you like to have a model in the first place? What would you like to get from it? The objectives of your research, therefore, should be clear enough so that you can derive full benefit from it.

In this particular case where I sought to develop a model, the main purpose was to determine the predictors of the number of published papers produced by the faculty of the university. The major question, therefore, is:

“What are the crucial factors that will motivate the faculty members to engage in research and publish research papers?”

When I was a research director of the university, I figured that the best way to increase the number of research publications was to zero in on the variables that really matter. There are so many variables that can influence the turnout of publications, but which ones really matter? A certain number of research publications is required each year, so what should the interventions be to reach those targets?

Time Frame and Unit of Analysis

You should have a specific time frame on which to base your analysis. There are many considerations in selecting the time frame of the analysis, but of foremost importance is the availability of data. For established universities with consistent data collection fields, this poses no problem; for struggling universities without an established database, it will be much more challenging.

Why do I say consistent data collection fields? If you want to see trends, then the same data must be collected in a series through time. What do I mean by this?

In the particular case I mentioned, i.e., the number of publications, one of the suspected predictors is the amount of time the faculty spend on administrative work. In a 40-hour work week, how much time do they spend in designated posts such as unit head, department head, or dean? This variable, which is a unit of analysis, should therefore be monitored consistently every semester, for many years, for possible correlation with the number of publications.

How many years should these data be collected? From what I gather, peer-reviewed publications normally take two to three years to produce. Hence, the study must cover at least three years of data to be able to log the number of publications produced; that is, if no systematic data collection was in place to supply the data needed by the study.

If data was systematically collected, you can backtrack and get data for as long as you want. It is even possible to compare publication performance before and after a research policy was implemented in the university.

Review of Literature

You might be guilty of “reinventing the wheel” if you did not take time to review published literature on your specific research concern. Reinventing the wheel means you duplicate the work of others. It is possible that other researchers have already satisfactorily studied the area you are trying to clarify issues on. For this reason, an exhaustive review of literature will enhance the quality and predictive value of your model.

For the model I attempted to make on the number of publications produced by the faculty, I came across a summary of the predictors made by Bland et al.[1] based on a considerable number of published papers. Below is the model they prepared to sum up the findings.

Bland et al.'s Model of Research Productivity

Bland and colleagues found that three major areas determine research productivity, namely 1) individual characteristics, 2) institutional characteristics, and 3) leadership characteristics. This means that you cannot simply threaten the faculty with the so-called publish-or-perish policy if the required institutional resources are absent and/or leadership quality is poor.

Select the Variables for Study

The model given by Bland and colleagues in the figure above is still too general to allow statistical analysis to take place. For example, in individual characteristics, how can socialization as a variable be measured? How about motivation?

This requires you to delve further into the literature on how to properly measure socialization and motivation, among the other variables you are interested in. The dependent variable I chose to reflect productivity in a recent study I conducted with students is the total number of publications, whether peer-reviewed or not.

Ensuring the Predictive Value of the Model

The predictive value of a model depends on the degree of influence of a set of predictor variables on the dependent variable. How do you determine the degree of influence of these variables?

In Bland's model, all the variables associated with the identified concepts may be included in analyzing the data. But of course, this will be costly and time consuming, as there are a lot of variables to consider. Besides, the greater the number of variables you include in your analysis, the more samples you will need to obtain a good correlation between the predictor variables and the dependent variable.

Stevens[2] recommends a nominal 15 cases per predictor variable. This means that if you want to study 10 variables, you will need at least 150 cases to make your multiple regression model valid in some sense. But of course, the more samples you have, the greater the certainty in predicting outcomes.
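
That rule of thumb reduces to a single multiplication, sketched below.

```python
def minimum_cases(n_predictors, cases_per_predictor=15):
    """Rough minimum sample size for a multiple regression, per Stevens' rule of thumb."""
    return n_predictors * cases_per_predictor

print(minimum_cases(10))  # 150 cases for a 10-predictor model
```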

Once you have decided on the number of variables you intend to incorporate in your multiple regression model, you can input your data into a spreadsheet or a statistical software package such as SPSS, Statistica, or related applications. The software will automatically produce the results for you.
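
For those without SPSS or Statistica, here is a minimal sketch of the same step in Python using statsmodels. The variable names and figures are hypothetical stand-ins for a faculty-productivity data set, not the study's actual data.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical data: one row per faculty member.
data = pd.DataFrame({
    "publications":   [0, 1, 3, 2, 5, 1, 4, 0, 2, 6],      # dependent variable
    "admin_hours":    [30, 25, 10, 15, 5, 28, 8, 35, 20, 4],
    "research_funds": [0, 10, 40, 25, 60, 5, 50, 0, 20, 80],
})

X = sm.add_constant(data[["admin_hours", "research_funds"]])  # predictors plus intercept
model = sm.OLS(data["publications"], X).fit()
print(model.summary())  # coefficients, R-squared, p-values
```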

The next concern is how to interpret the results of a model, such as the output of a multiple regression analysis. I will consider this topic in my upcoming posts.

Note

A model is only as good as the data used to create it. You must therefore make sure that your data is accurate and reliable for better predictive outcomes.

References:

  1. Bland, C.J., Center, B.A., Finstad, D.A., Risbey, K.R., and J. G. Staples. (2005). A Theoretical, Practical, Predictive Model of Faculty and Department Research Productivity. Academic Medicine, Vol. 80, No. 3, 225-237.
  2. Stevens, J. (2002). Applied multivariate statistics for the social sciences, 3rd ed. New Jersey: Lawrence Erlbaum Publishers. p. 72.

Data Accuracy, Reliability and Triangulation in Qualitative Research

As a researcher, you might want to make sure that whatever information you gather in the field can be depended upon. How will you be able to ensure that your data is accurate and reliable? This article explains the importance of verifying information through a technique called triangulation.

Data Accuracy and Reliability

Do you know what the GIGO rule is? GIGO is an acronym for Garbage In, Garbage Out. This rule was popular in the early days of computer use, when whatever you input into the computer was processed without question.

Data accuracy and reliability are very important concerns in doing good research because inaccurate and unreliable data lead to spurious or wrong conclusions. If, for some reason, you inadvertently input wrong data into the computer, output will still be produced. But of course, the results are erroneous because the data entered is faulty. It is also possible that you input the data correctly but then the data does not reflect what you really want to measure.

Thus, it is always good practice to review whatever data you have before entering it into your computer through a software application like a spreadsheet or a statistical package. Each data point should be verified for accuracy and input meticulously. Once entered, the data must again be reviewed for accuracy. An extra zero in a number entered into a cell will distort the resulting graph or correlation analysis, and data input into the wrong category can destroy data reliability.
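
A minimal sketch of that kind of screening, assuming systolic blood pressure readings as the example; the values and plausibility range are illustrative only.

```python
# One entry has an extra zero typed in: 1190 instead of 119.
systolic_readings = [118, 122, 1190, 121, 117]

plausible_min, plausible_max = 70, 250   # assumed plausible range in mm Hg
suspect = [x for x in systolic_readings
           if not plausible_min <= x <= plausible_max]

print("Review these entries before analysis:", suspect)  # [1190]
```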

This data verification strategy works for quantitative data, which are obtained mainly through the application of standardized measurement scales such as nominal (categorical), ordinal, interval, and ratio. The latter two offer the most accurate measurement scales, and the data obtained with them allow for sound statistical analysis. Although measurements will vary between observers, as some researchers apply a meticulous approach to what they are doing while others do it casually, measurement errors can be controlled to a certain degree.

In the case of qualitative research, which is highly subjective in nature, there are also ways by which data can be verified or validated. This is through the so-called triangulation method.

What is the Triangulation Method?

Triangulation is one of the popular research tools that researchers commonly use in an attempt to verify the accuracy of data obtained from the field. As the word connotes, it refers to the application of three approaches or methods to verify data.

Why three? This works just like a global positioning system, or GPS, where you need at least three satellites to tell you your exact location. Simply put, this means that you should not rely on only one source of information to answer your questions; at least three should be put to practical use.

At best, the answers to the questions you pose in qualitative research represent people's viewpoints, and these viewpoints should be verified through other means. If you have only one source of information and that information is false, then your data is 100% erroneous and, consequently, your conclusions are faulty. Having several information sources gives researchers confidence that the data they are getting approximates the truth.


Methods of Triangulation in Qualitative Research

The most common methods used as a demonstration of triangulation are the household interview or HHI, key informant interview (KII), and focus group discussion (FGD). These approaches rely on the information provided by a population of respondents with a predetermined set of characteristics, knowledgeable individuals, and a multi-sectoral group, respectively.

HHI utilizes structured questionnaires administered by trained interviewers to randomly selected individuals, usually the household head as the household representative. It is a rapid approach to getting information from a subset of the population in an attempt to describe the characteristics of the general population. The data obtained are largely approximations and highly dependent on the honesty of the respondents.

Second, the KII approach obtains information from key informants. A key informant is someone who is expected to be well acquainted with the issues and concerns besetting the community. Almost always, the key informants are elders or long-time residents familiar with community dynamics and changes in the community through time.

Third, FGD elicits responses from representatives of the different sectors of society. These representatives are usually called the stakeholders, meaning, they have a stake or are influenced by whatever issue or concern is being investigated. Fishers, for example, are affected by the establishment of protected areas in their traditional fishing grounds.

Conclusion

Data accuracy is threatened by the inherent subjectivity of data obtained through qualitative methods. Therefore, a combination of qualitative methods such as household interview, key informant interview, and focus group discussion can reduce errors and provide greater confidence to researchers employing qualitative approaches. This is referred to as triangulation.

Reference:

Janssen, C. (n.d.). Garbage In, Garbage Out (GIGO). Retrieved on July 28, 2013 from http://www.techopedia.com/definition/3801/garbage-in-garbage-out-gigo

© 2013 July 28 P. A. Regoniel

What is the Difference Between Benchmarking and Baselining?

What is the difference between benchmarking and baselining in view of applying these concepts in research? Read and note the difference.

While benchmarking and baselining are very common terms used by many people, their usefulness and relevance in research are not well discussed. I say this because when I tried to look up the meaning of these words online, the definitions referred either to business or to computers.

This article, therefore, aims to differentiate one from the other and to highlight the importance of understanding these two terms in view of doing good research.

What is benchmarking?


Benchmarking is the process of comparing something — performance, practices, cost, quality, among others — with a standard. A standard, in this sense, is a desirable state for doing something. It could be a model to emulate or look up to.

Benchmarking is usually applied in business for management purposes. An organization might want to compare itself with another organization that performs better than it does, with the end in view of being able to compete with, or at the very least be at par with, the best in the industry. The best in the industry are those organizations that follow so-called best practices that help them 1) attain quality in whatever product they produce, 2) save time, or 3) minimize cost. These are not, however, the only benefits that can be gained from benchmarking.

The relevance of benchmarking in doing research

First and foremost, a set of measurable indicators must be identified before making comparisons between the subjects or entities to be compared. Without indicators, there will be no way of comparing, say, work performance between the model organizations and the one aspiring to do better. The indicators that serve as the units of comparison, therefore, should be the same. You cannot compare the heights of individuals using inches for one and centimeters for the other; you have to standardize on just one measurement scale.

Indicators are similar to the variables that researchers have to contend with in order to make an objective comparison. If indicators are not measurable, the only way to make comparisons is through subjective judgment, which may be done by experts. Subjective judgment, however, is unscientific and prone to erroneous conclusions.

To be more specific and to make clear the idea of benchmarking, consider the following scenario:

A newly established university (NEU) wants to find out if it competes well with other universities in terms of the annual number of publications its faculty produces. Specifically, the administration would like to see an increase in the number of refereed publications produced each year.

The university president sends the research staff to three well-known universities in the country to find out the best practices those universities employ that enable them to deliver many publications without sacrificing quality. Once these best practices are identified and applied in the aspiring university, the president will want to know how the faculty performed.

What is the indicator for comparison in this case? Obviously, it is the number of refereed publications. It would take years for the best practices to take effect in the aspiring university, so the comparison may be made after, say, 10 years. After that period has elapsed, data on the number of refereed publications from the model university may be compared with those of the NEU.
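
A minimal sketch of that comparison is shown below. The publication and faculty counts are hypothetical; the normalization per faculty member is an assumption added so that universities of different sizes can be compared fairly.

```python
# Hypothetical annual figures for the model university and the NEU.
universities = {
    "Model university": {"refereed_publications": 120, "faculty": 300},
    "NEU":              {"refereed_publications": 18,  "faculty": 90},
}

for name, figures in universities.items():
    rate = figures["refereed_publications"] / figures["faculty"]
    print(f"{name}: {rate:.2f} refereed publications per faculty member per year")
```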

What is baselining?

Baselining differs from benchmarking in that the comparison is made within a group or organization before and after an intervention has been made. The organization is not compared against an outside entity but against itself. Measurable indicators, in this instance, are still important in order to find out whether considerable progress has been made.

In the example given above, the university may compare its past performance, i.e., the number of refereed publications for the last 10 years, with its performance in the next 10 years after implementing interventions based on best practices in model universities. There is a historical element in this comparison.
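
Sketched below is what such a before-and-after comparison might look like for the university example; the yearly publication counts are hypothetical.

```python
from statistics import mean

before = [3, 4, 2, 5, 3, 4, 3, 2, 4, 3]   # refereed publications per year, 10 years before
after  = [5, 6, 7, 6, 8, 7, 9, 8, 10, 9]  # 10 years after the interventions

print(f"Baseline mean: {mean(before):.1f} publications per year")
print(f"Post-intervention mean: {mean(after):.1f} publications per year")
```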

A baseline study may be used in evaluating government policy such as the performance of the number coding scheme to regulate vehicle traffic and reduce air pollution. This can prevent a costly and ineffective hit-and-miss approach to policy making.

References:

International Business Machines. (n.d.). Baselining projects. Retrieved from http://pic.dhe.ibm.com/infocenter/synhelp/v7m2r0/index.jsp?topic=%2Fcom.ibm.rational.synergy.manage.doc%2Ftopics%2Fs_t_bmg_baselining_projs.html

Management Analysis and Development. (n.d.).  What is benchmarking? Retrieved from http://www.mad.state.mn.us/benchmarking

The Benchmarking Exchange. (n.d.). What is benchmarking? Retrieved from http://www.benchnet.com/wib.htm

© 2013 July 18 P. A. Regoniel