• In Section 1.2, we distinguished between two types of data:
    • Data resulting from observations on a counting variable.
    • Data obtained by observing values of a measurement variable.
  • A slightly more formal distinction characterizes two different types of random variables:
    • Counting variables (discrete random variables).
    • Measurement variables (continuous random variables).

discrete random variable

  • A discrete random variable is defined as:
    • An rv whose possible values:
      • either constitute a finite set,
      • or can be listed in an infinite sequence (countably infinite) with a first element, a second element, and so on.
  • A random variable is continuous if both of the following conditions apply:
    1. Its set of possible values consists of:
      • All numbers in a single interval on the number line (possibly infinite in extent, e.g., from −∞ to ∞), or
      • All numbers in a disjoint union of such intervals (e.g., [0, 10] ∪ [20, 30]).
    2. No possible value of the variable has positive probability:
      • That is, P(X = c) = 0 for every possible value c.
  • Any interval on the number line contains infinitely many numbers:
    • These values cannot be listed in an infinite sequence
      • Reason: there are simply too many of them
  • Second condition describing a continuous random variable:
    • Counterintuitive implication:
      • It seems to assign a total probability of zero to the set of all possible values
    • Clarification (to be discussed in Chapter 4):
      • Intervals of values have positive probability
      • The probability of an interval decreases to zero as the width of the interval shrinks to zero (illustrated numerically in the sketch below)
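To make the contrast concrete, here is a minimal numerical sketch, not taken from the text: it uses SciPy's binomial and normal distributions as illustrative choices, and the names pmf, c, and width are ad hoc. It shows a discrete rv placing positive probability on individual values, while for a continuous rv a single point has probability zero even though intervals around that point do not.

```python
# Illustrative sketch: a discrete rv has positive probability at individual
# values; a continuous rv has P(Y = c) = 0, yet intervals have positive
# probability that shrinks to zero with the interval's width.
from scipy.stats import binom, norm

# Discrete rv: X ~ Binomial(n=4, p=0.5) has the finite support {0, 1, 2, 3, 4}.
discrete_support = range(5)
pmf = {x: binom.pmf(x, n=4, p=0.5) for x in discrete_support}
print("P(X = 2) =", pmf[2])               # a single value has positive probability
print("pmf sums to", sum(pmf.values()))   # probabilities over the support sum to 1

# Continuous rv: Y ~ Normal(0, 1). A single point has probability zero, but
# an interval around it has positive probability that decreases with its width.
c = 1.0
print("P(Y = c) =", norm.cdf(c) - norm.cdf(c))   # exactly 0
for width in (1.0, 0.1, 0.01, 0.001):
    p = norm.cdf(c + width / 2) - norm.cdf(c - width / 2)
    print(f"P({c - width/2:.4f} <= Y <= {c + width/2:.4f}) = {p:.6f}")
```

Running the loop shows the interval probabilities falling toward zero as the width shrinks, which is the behavior the second condition describes.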

EX 3.6

  • Studying basic properties of discrete random variables (rv’s):
    • Requires tools of discrete mathematics:
      • Summation
      • Differences
  • Studying continuous random variables:
    • Requires tools of continuous mathematics (calculus), as contrasted in the sketch below:
      • Integrals
      • Derivatives
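As a rough illustration, not from the text, the sketch below computes the same kind of quantity, a mean, with each toolkit: a finite sum for a binomial rv and a numerical integral for a normal rv. The particular distributions and the integration limits (−50, 50) are arbitrary choices.

```python
# Illustrative sketch: the discrete toolkit (summation) versus the continuous
# toolkit (integration), each applied to the mean of a random variable.
from scipy.stats import binom, norm
from scipy.integrate import quad

# Discrete rv X ~ Binomial(n=4, p=0.5): E[X] is the finite sum of x * p(x).
mean_discrete = sum(x * binom.pmf(x, n=4, p=0.5) for x in range(5))
print("E[X] by summation:", mean_discrete)        # 2.0, i.e. n * p

# Continuous rv Y ~ Normal(mean=1, sd=2): E[Y] is the integral of y * f(y) dy.
# The limits -50 to 50 stand in for -infinity to infinity; the density is
# negligible beyond them.
mean_continuous, _ = quad(lambda y: y * norm.pdf(y, loc=1, scale=2), -50, 50)
print("E[Y] by integration:", mean_continuous)    # approximately 1.0
```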