Sunday, April 20, 2014

Collaboration Between Team Mates

Manderson et al
At the beginning of the article, Manderson and his co-authors drew a distinction between qualitative and quantitative research instruments. They noted that in well-designed quantitative research, as long as the instruments are reliable and the interpretations are valid, there is limited room for varying interpretations. This may not be true for qualitative studies: the advantage of qualitative research is its richness, so interpretations are situational and hard to “transfer” between researchers and situations. This, however, raises issues about the validity, reliability, and interpretability of qualitative data, since different methods of interpreting and using raw data produce different results. Therefore, in this article, Manderson and his co-authors proposed a “structured method of analysis,” which uses highly structured analytic procedures with “explicit codes and categories.” In this way, qualitative data, which inherently involve interpretation, are partially transformed into quantitative form. The pitfall of this method is that it risks de-contextualizing the qualitative data. Ross described an alternative way to address this problem: sorting the files into categories for further analysis.
Another issue caused by having multiple researchers on a team is the difficulty of assuring the confidentiality of data. This might require the principal investigator to ensure that every researcher has completed IRB certification.

Sin:
Continuing Manderson’s notion of the “structured method of analysis,” Sin described an evaluation study that used mixed methods to quantify qualitative data. However, quantifying qualitative data alone is not enough to improve the validity of interpretations. In the article, Sin reflected that clear codes are necessary when a project involves collaboration within a team. This doesn’t mean that codes will never change after team members dig into the data; researchers therefore adopt pilot coding to test whether a coding tree is reliable enough. The coding process described in Sin’s work shares many similarities with my own research experience. Given the collaborative nature of my group, preliminary and ambiguous coding schemes, especially at the beginning, would likely cause confusion and divergent understandings among collaborators. Therefore, we tend to develop a very clear codebook together with the coding scheme, which appears to be an effective way to reduce ambiguity among researchers. Once the codebook has clear descriptions, we move to pilot coding: we usually take 10% of the data and code it based on the preliminary coding scheme. This process also involves adding more “grounded” codes that emerge from the data.
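One way to check whether a pilot-coded sample shows the coding scheme is reliable enough is to compute a simple inter-coder agreement measure. Below is a minimal sketch in Python, assuming each coder assigns exactly one code per segment; the code names and the ten-segment pilot sample are entirely hypothetical, not data from any of the studies discussed:

```python
from collections import Counter

def percent_agreement(coder_a, coder_b):
    """Share of segments both coders labeled identically."""
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

def cohens_kappa(coder_a, coder_b):
    """Agreement corrected for chance, assuming one code per segment."""
    n = len(coder_a)
    p_observed = percent_agreement(coder_a, coder_b)
    counts_a, counts_b = Counter(coder_a), Counter(coder_b)
    # Chance agreement: probability both coders pick the same code at random.
    p_chance = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (p_observed - p_chance) / (1 - p_chance)

# Hypothetical pilot sample: codes two coders assigned to ten segments.
coder_a = ["engage", "confusion", "engage", "off-task", "engage",
           "confusion", "engage", "engage", "off-task", "engage"]
coder_b = ["engage", "confusion", "engage", "engage", "engage",
           "confusion", "off-task", "engage", "off-task", "engage"]

print(percent_agreement(coder_a, coder_b))        # 0.8
print(round(cohens_kappa(coder_a, coder_b), 3))   # 0.643
```

Low agreement on the pilot sample would signal that the codebook descriptions are still ambiguous and need another round of refinement before coding the full dataset.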
How software is used is determined by the conditions of the study rather than by the functions of the software. This relates to the point that software doesn’t analyze the data; people do. Another misplaced expectation behind software use is that computer tools will analyze the data as quickly as people hope.
N6 enables researchers to combine quantitative and qualitative methods. However, researchers don’t have settled answers concerning what is being quantified and what types of quantitative analysis are most appropriate.

Barry:

Unlike the other two studies, which tried to quantify qualitative data in order to improve reliability among researchers in a group, Barry approached this problem from the perspective of reflexive practice. Reflexivity is defined as the researcher’s awareness of his or her own presence in the research process. The major part of this study described how Barry’s team constructed reflexive accounts. Two tools were used: one was a narrative recording individual researchers’ reflexivity, and the other concerned definitions of key theoretical concepts. The first involved the researchers sharing their positions and their preferred theoretical frameworks; the second involved reflecting on their theoretical stances and how those would guide the study design.

Thursday, April 17, 2014

Reflections After Dr. Paulus Talk

Personally, I really enjoyed Dr. Paulus’s presentation on Tuesday. She gave us a more holistic view of carrying out a project in ATLAS. I liked the point she made at the beginning that analytical tools don’t do the analysis for you; researchers are still the ones who carry out the analysis. Moreover, she emphasized that using ATLAS does not entail adopting grounded theory. ATLAS does support coding and segmenting, but this doesn’t mean researchers need to use it for coding, segmenting, and retrieving. Sometimes researchers code and segment simply to organize their data well. As a new user of ATLAS, I use it a lot for organizing data.

Dr. Paulus mentioned that using analytical tools makes the research process more transparent. I already experienced this in my skill builder assignment, where I tried to integrate analytical tools into the case study I have been working on. Although I still need to locate evidence from each data source myself to triangulate my results, with an analytical tool, linking evidence across multiple files became easier and more transparent. Being transparent means the analysis is easier to trace back in the future.


Dr. Paulus noted that ATLAS also supports literature reviews. I am excited to explore this more, maybe in my newsletter column assignment. This is really important for qualitative researchers. For a literature review, there is no need to review literature that is not relevant to the study. However, to join the conversation among researchers in the field, one needs to show how the current study relates to and addresses issues in the literature. Using ATLAS’s ability to link multiple files in one project, connecting the literature review with the analysis process becomes easier.

Sunday, April 13, 2014

Debates of Qualitative Tools

In the section on the size of databases, Taylor and her co-authors mentioned that it was not qualitative research tools that led to the trend of collecting large datasets; sometimes it is the research project itself that calls for a large dataset. This is quite true and visible in my daily research experience. Our project adopts design-based research, through which we assess students’ learning not only from the growth shown from pre-test to post-test; we care more about their learning process and treat the classroom as an ecology. This requires multiple sources of data that record students’ learning experiences, such as their participation patterns, artifacts, and videos. Having a large dataset doesn’t necessarily mean adopting a shallow analytical method. We used pre-tests and post-tests to examine students’ overall learning patterns, and process data, such as artifacts and classroom video, to unveil the journey students took to arrive at that destination. Qualitative research tools do a nice job of organizing different kinds of data and linking between them, which would be time-consuming without such tools. However, it is still researchers who do the analytical work; analytical tools perform clerical and technical tasks.
Another feature of qualitative tools that amazes me is how easy they make it to add new codes to an existing coding system when necessary. Going back to the raw data as new codes are added is not uncommon, and compared to the tedium of doing this manually, qualitative tools greatly expedite the process.
I agree that qualitative research tools make the analytical process more transparent. This is especially true when researchers need to link multiple datasets or make sense of a large dataset. For example, without tools, I would need either to remember the linkages between files or to write them down. Given the importance of triangulating data sources, linking between files matters.

I am not worried that qualitative research tools could create distance between researchers and data. Although qualitative research tools tend to popularize one analytical method, coding, doing coding and segmenting is sometimes a first step toward more advanced analysis. Therefore, technological tools should not be blamed: if a researcher wants to adopt simple "coding and counting" as his or her method, he or she would do that without technology.

Thursday, April 10, 2014

ATLAS.ti

I want to reflect on the features of ATLAS and how they relate to the reflexive nature of qualitative research. ATLAS supports preliminary coding through what it calls free codes. This is an important feature for qualitative researchers: reflecting on my experience composing coding schemes, having a place to record preliminary codes is important and helpful. This may be especially true for researchers who base their analysis on grounded theory. Another feature of ATLAS that supports the coding process is that researchers don’t need to have the relationships among codes ready before they dig into the data. After they have looked at the data and developed codes, they can organize the relationships among codes using the networks ATLAS provides.

Besides, as I mentioned in my skill builder, compared with other qualitative research tools, ATLAS supports juxtaposing four documents within one project, which supports data triangulation for qualitative researchers.


Another feature I love most in ATLAS is that it connects codes closely with their documents. For video coding, it positions codes right alongside the video. Although dedoose highlights documents in different colors based on codes, it doesn’t position the codes close to the data, which makes tracking back to the data especially difficult.