
TYPES OF DATA

  • In order to understand the nature of data, it is necessary to categorize them into various types. 
  • Different categorizations of data are possible. 
  • The first such categorization may be on the basis of the discipline in which the data are generated, e.g., the Sciences or Social Sciences. 
  • Within each of these fields, there may be several ways in which data can be categorized into types.
There are four types of data: 
  1. Nominal 
  2. Ordinal 
  3. Interval 
  4. Ratio 
Each offers a unique set of characteristics, which impacts the type of analysis that can be performed.

The distinction between the four types of scales centers on three different characteristics:
  1. The order of responses – whether it matters or not
  2. The distance between observations – whether it matters or is interpretable
  3. The presence or inclusion of a true zero
Nominal Scales
Nominal scales measure categories and have the following characteristics:
  • Order: The order of the responses or observations does not matter.
  • Distance: Nominal scales do not hold distance. The distance between a 1 and a 2 is not interpretable; the numbers are merely labels for categories.
  • True Zero: There is no true or real zero. In a nominal scale, zero is uninterpretable.
Appropriate statistics for nominal scales: mode, count, frequencies.
Displays: histograms or bar charts.
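As a quick illustration, the short Python sketch below (using a hypothetical, made-up list of browser names) computes the frequencies and the mode of a nominal variable:

    from collections import Counter

    # Hypothetical nominal data: browser names are labels with no order or distance.
    browsers = ["Chrome", "Firefox", "Chrome", "Safari", "Chrome", "Firefox"]

    counts = Counter(browsers)            # frequencies per category
    mode = counts.most_common(1)[0][0]    # the most frequent category

    print("Frequencies:", dict(counts))   # {'Chrome': 3, 'Firefox': 2, 'Safari': 1}
    print("Mode:", mode)                  # Chrome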

Ordinal Scales 
At the risk of providing a tautological definition, ordinal scales measure, well, order. So, our characteristics for ordinal scales are: 
  • Order: The order of the responses or observations matters. 
  • Distance: Ordinal scales do not hold distance. The distance between first and second is unknown, as is the distance between first and third, or between any other pair of observations. 
  • True Zero: There is no true or real zero. An item, observation, or category cannot finish in a zeroth place.
Appropriate statistics for ordinal scales: count, frequencies, mode.
Displays: histograms or bar charts.
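A minimal Python sketch, using hypothetical T-shirt sizes as the ordered categories, shows how the same counts and mode apply while the ordering of the levels is preserved in the report:

    from collections import Counter

    # Hypothetical ordinal data: the levels are ordered, but the gaps between them are undefined.
    levels = ["small", "medium", "large"]
    sizes = ["small", "large", "medium", "medium", "small", "medium"]

    counts = Counter(sizes)
    mode = counts.most_common(1)[0][0]

    # Report frequencies in their natural order rather than alphabetically.
    for level in levels:
        print(level, counts[level])
    print("Mode:", mode)                  # medium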

Interval Scales 
Interval scales provide insight into the variability of the observations or data. Classic interval scales are Likert scales (e.g., 1 - strongly agree and 9 - strongly disagree) and Semantic Differential scales (e.g., 1 - dark and 9 - light). In an interval scale, users could respond to “I enjoy opening links to the website from a company email” with a response on a scale of values. 
The characteristics of interval scales are: 
  • Order: The order of the responses or observations does matter. 
  • Distance: Interval scales do offer distance. That is, the distance from 1 to 2 is the same as the distance from 4 to 5, so differences between values are meaningful and we can add and subtract the data. Because there is no true zero, however, ratio statements such as “six is twice as much as three” are not meaningful on an interval scale. 
  • True Zero: There is no true zero with interval scales. However, data can be rescaled in a manner that contains zero. An interval scale measured from 1 to 9 remains the same as one from 11 to 19, because we added 10 to all values. Similarly, a 1 to 9 interval scale is the same as a -4 to 4 scale, because we subtracted 5 from all values. Although the new scale contains zero, zero remains uninterpretable because it only appears in the scale through the transformation. 
Appropriate statistics for interval scales: count, frequencies, mode, median, mean, standard deviation (and variance), skewness, and kurtosis.
Displays: histograms or bar charts, line charts, and scatter plots.
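The sketch below uses a hypothetical set of 1-9 Likert responses to compute a few of these statistics with Python's standard statistics module, and illustrates the rescaling point above: adding a constant shifts the mean but leaves the spread unchanged.

    import statistics

    # Hypothetical interval data: responses on a 1-9 Likert scale.
    responses = [2, 5, 7, 4, 6, 5, 8, 3, 5, 6]

    print("Mean:", statistics.mean(responses))
    print("Median:", statistics.median(responses))
    print("Std dev:", statistics.stdev(responses))
    # Skewness and kurtosis would need an extra library, e.g. scipy.stats.skew / scipy.stats.kurtosis.

    # Rescaling: add 10 to every response (a 1-9 scale becomes an 11-19 scale).
    shifted = [x + 10 for x in responses]
    print("Shifted mean:", statistics.mean(shifted))       # mean increases by exactly 10
    print("Shifted std dev:", statistics.stdev(shifted))   # spread is unchanged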

Ratio Scales
Ratio scales are interval scales with a true zero. 
They have the following characteristics:
  • Order: The order of the responses or observations matters.
  • Distance: Ratio scales do have an interpretable distance.
  • True Zero: There is a true zero.
Income is a classic example of a ratio scale:
  • Order is established. We would all prefer $100 to $1!
  • Zero dollars means we have no income (or, in accounting terms, our revenue exactly equals our expenses!)
  • Distance is interpretable, in that $20 is twice $10 and $50 is half of $100.
For the web analyst, the statistics for ratio scales are the same as for interval scales.
Appropriate statistics for ratio scales: count, frequencies, mode, median, mean, 
standard deviation (and variance), skewness, and kurtosis.
Displays: histograms or bar charts, line charts, and scatter plots.
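To round out the income example, here is a short Python sketch over a hypothetical list of dollar amounts; because the scale has a true zero, ratio statements such as “$20 is twice $10” are meaningful alongside the usual statistics.

    import statistics

    # Hypothetical ratio data: incomes in dollars. Zero genuinely means "no income".
    incomes = [0, 10, 20, 50, 100, 100, 250]

    print("Mean:", statistics.mean(incomes))
    print("Median:", statistics.median(incomes))
    print("Std dev:", statistics.stdev(incomes))

    # With a true zero, ratios are meaningful: $20 is twice $10.
    print(20 / 10)    # 2.0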
