Analyzing the Sources

From CNM Wiki

Analyzing the Sources (hereinafter, the Session) is a hands-on session designed to get its participants started analyzing the data relevant to WorldOpp. The Session is the seventh of ten sessions of CNM Cyber Placement (hereinafter, the Seminar).

Analyzing and verifying data sources and data


Outline

Discovering the Data is the predecessor session.

How to verify data sources

Verification of data sources in this case refers to determining whether the source of the data presented is credible. This is especially important when the learner is using external sources of information in his or her learning path.

Data sources are divided into primary and secondary sources.

Primary sources refer to data in its raw form. This means that the researcher went to the field and collected the data, and as such it is not available elsewhere. Secondary sources of data refer to books, periodicals, or other forms that have been used to store and preserve data already collected. A researcher will only go to the field to collect primary data if the available secondary data cannot be used in the research.

CNM Wiki makes use of secondary sources of data to explain concepts to the learners. This is captured in lectures, videos, and other learning materials.

To verify a data source, the candidate needs to trace the source of the data in question and determine whether it is authoritative before accepting the data as 'gospel truth'. For instance, someone looking for information about Michael Jackson would get more credible information from his biography or official website than from sensational media, which often exaggerate issues to make a story.

How to analyze data sources

There are several ways to evaluate the credibility of a data source:

  1. Author - When reviewing data found online, it is important to identify the author of the resource. Peer-reviewed journals and scholarly articles state this clearly, but some sites shy away from attributing content to a specific writer. Users should treat sources without author details with caution and consult a second source to confirm the data. When analyzing the author details, the user should also establish whether the author is a subject-matter expert in the field, what his or her academic qualifications are, and whether he or she is biased.
  2. Citation of the primary sources of data used - For a source to be considered credible, it should cite the origin of the data contained therein. Sources such as books should name other sources that corroborate the data and provide references that the user can follow to confirm the information.
  3. Domain name - Where the user is using the web for research, priority should be given to domain names in the following order:
    1. Education: .edu
    2. Academic: .ac
    3. Government: .gov
    4. Organisation: .org
    5. Commercial: .com
    6. Network: .net

A source with the extension '.edu' is highly likely to have more credible information than one with a '.com' extension. Be careful with the '.org' domain: it is usually used by non-profit organizations, which may have an agenda of persuasion rather than education.
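As an illustration only, the ranking above can be expressed as a small Python sketch that sorts candidate URLs by top-level domain. The function name and the example URLs are assumptions made for this sketch; they are not part of any CNM tool.

  from urllib.parse import urlparse

  # Priority order mirrors the list above (lower index = more credible).
  DOMAIN_PRIORITY = [".edu", ".ac", ".gov", ".org", ".com", ".net"]

  def domain_rank(url):
      """Return the priority index of a URL's domain (unknown domains sort last)."""
      host = urlparse(url).hostname or ""
      for rank, suffix in enumerate(DOMAIN_PRIORITY):
          if host.endswith(suffix) or (suffix + ".") in host:
              return rank
      return len(DOMAIN_PRIORITY)

  # Example: sort hypothetical sources so the more authoritative come first.
  sources = ["https://example.com/story", "https://stats.example.edu/report",
             "https://data.example.gov/dataset"]
  for url in sorted(sources, key=domain_rank):
      print(domain_rank(url), url)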

  4. Writing style - Poor spelling and grammar are an indication that the site may not be credible. Credible sites watch their writing style closely in an effort to make the information presented easy to understand.

How to verify data

Data verification is important since it informs the user's conclusions on a given concept. To verify data, the learner should evaluate the following:

  • Style of writing
  • Date the information was published - this allows the learner to determine whether the information contained therein is recent enough for the purposes of the research.
  • Relevance
  • Scope
  • Target audience

Finally, always look for a second and third source of data to confirm the accuracy of the information rather than relying on the first source alone.
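As a minimal sketch only, the checklist above could be captured in a few lines of Python. The field names and the five-year freshness threshold are assumptions made for illustration, not CNM requirements.

  from datetime import date

  def passes_checklist(record, max_age_years=5):
      """Check a source against the criteria above: style, publication date,
      relevance, scope and target audience. Returns the verdict and any failures."""
      age = date.today().year - record["published"].year
      checks = {
          "style": record["style_ok"],
          "recency": age <= max_age_years,
          "relevance": record["relevant"],
          "scope": record["in_scope"],
          "audience": record["audience_ok"],
      }
      failed = [name for name, ok in checks.items() if not ok]
      return len(failed) == 0, failed

  # Example record; the values are invented.
  verdict, failed = passes_checklist({"style_ok": True, "published": date(2020, 8, 16),
                                      "relevant": True, "in_scope": True,
                                      "audience_ok": True})
  print(verdict, failed)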

How to analyze data

Data analysis can be described as the process of manipulating and modelling data to discover useful information that can be used to support decisions and conclusions. Data analysis is multi-faceted and employs different techniques depending on the domain. For instance, data analysis in business uses different tools compared to analysis in the social sciences. Tools such as Visio, Matlab, and Stata are very helpful in data analysis. In most cases, Microsoft Excel comes in handy for making presentations and visualizing the data.
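The Session does not prescribe a particular tool, but as a hedged sketch, the same kind of quick summary and visualization could be produced in Python with pandas and matplotlib. The monthly figures below are invented for the example.

  import pandas as pd
  import matplotlib.pyplot as plt

  # Invented monthly figures, used only to illustrate summarizing and plotting.
  sales = pd.DataFrame({
      "month":   ["Jan", "Feb", "Mar", "Apr"],
      "revenue": [1200, 1350, 980, 1500],
      "costs":   [900, 1000, 1100, 1050],
  })
  sales["profit"] = sales["revenue"] - sales["costs"]

  print(sales.describe())  # basic descriptive statistics
  sales.plot(x="month", y=["revenue", "costs", "profit"], kind="bar")
  plt.title("Monthly revenue, costs and profit")
  plt.savefig("monthly_summary.png")  # chart for the presentation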

Steps in data analysis

  1. Defining the objectives - The user needs to understand what questions the data should answer. This is an important step because it determines how the data will be manipulated and visualized. The objectives for the analysis should be measurable, clear, and concise. For example, a business owner in a start-up company may find himself making losses and unable to meet financial obligations as and when they fall due. Sample objectives for his research would be: can he reduce the workforce without compromising on quality? Can he get longer-term sources of funding without increasing the cost of capital?
  2. Deciding on what to measure and how to measure it - With the objectives in place, the business owner in the example above can decide what to measure and how to measure it. For example, he could measure how much work (productivity) is done by one person and use this to determine how many employees would be optimal given the existing resources and needs. He could also compare the costs of capital from different sources and the terms of repayment.
  3. Data Gathering - Data collection may make use of all or some of the methods listed below.
    1. Use of secondary data
    2. Conducting interviews
    3. Administering questionnaires
    4. Sampling
    5. Observation
  4. Data Scrubbing - This involves amending the data and removing incorrect or superfluous entries from the set. It is important to scrub and cleanse data, since 'dirty' data will affect the overall process and the decisions made. It should, however, be noted that data scrubbing is not a substitute for careful collection; it does not excuse gathering 'dirty' data to begin with.
  5. Data Analysis - Applying the chosen techniques and tools to the cleaned data to answer the questions defined in the objectives.
  6. Data Interpretation and Presentation - Drawing conclusions from the results and presenting them, for example as charts or summary tables, to support decision making. A minimal worked sketch of steps 3 to 6 follows this list.
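The following is a minimal end-to-end sketch of steps 3 to 6, using the start-up example from step 1. The employee records are invented, and pandas is only one possible tool; the Session does not prescribe it.

  import pandas as pd

  # Step 3: gather data (here, a small hand-made dataset).
  raw = pd.DataFrame({
      "employee": ["Ann", "Ben", "Ann", "Cara", None, "Dave"],
      "units_per_week": [42, 35, 42, None, 28, 51],
      "weekly_cost": [800, 750, 800, 700, 650, 900],
  })

  # Step 4: data scrubbing - drop duplicate and incomplete rows.
  clean = raw.drop_duplicates().dropna()

  # Step 5: data analysis - measure productivity per unit of cost,
  # the quantity chosen in step 2.
  clean["units_per_dollar"] = clean["units_per_week"] / clean["weekly_cost"]

  # Step 6: interpretation and presentation - rank staff by productivity
  # to inform the workforce question from step 1.
  print(clean.sort_values("units_per_dollar", ascending=False))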


The researcher should be able to select and defend an appropriate method of data collection.

Composing the Documents is the successor session.

Materials

Recorded audio

Recorded video

Live sessions

Texts and graphics

See also