About this Study Set
This study set covers Statistics & Probability through 18 practice questions. The questions probe deep understanding of statistical theory, probability axioms, and their applications in scientific contexts.
Questions & Answers
Browse all 18 questions from the Advanced Statistics and Probability Concepts study set below.
1. Which theorem establishes that the distribution of the sample mean, when the sample size is sufficiently large, approaches a normal distribution, irrespective of the population's distribution?
- A. Bayes' Theorem
- B. The Law of Large Numbers
- C. The Central Limit Theorem
- D. The Borel-Cantelli Lemma
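The Central Limit Theorem behind this question is easy to see numerically. A minimal sketch (not part of the study set): averaging draws from a deliberately skewed exponential population still yields approximately normal sample means, with mean and spread matching the CLT's prediction.

```python
import random
import statistics

# Illustrative sketch: sample means of a skewed Exponential(1) population
# (population mean 1, variance 1). By the CLT, the distribution of the
# sample mean for n = 100 should be roughly Normal(1, 0.1^2).
random.seed(0)

def sample_mean(n):
    # Mean of n independent draws from an Exponential(1) population.
    return sum(random.expovariate(1.0) for _ in range(n)) / n

means = [sample_mean(100) for _ in range(2000)]

# CLT prediction: mean near 1.0, standard deviation near 1/sqrt(100) = 0.1.
print(round(statistics.mean(means), 2))   # close to 1.0
print(round(statistics.stdev(means), 2))  # close to 0.1
```

The population here is strongly right-skewed, which is the point: the approximate normality of the sample mean does not depend on the population's shape.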
2. In the context of hypothesis testing, what is the primary implication of a Type I error?
- A. Failing to reject a false null hypothesis.
- B. Rejecting a true null hypothesis.
- C. Accepting a true alternative hypothesis.
- D. Failing to accept a false alternative hypothesis.
3. The Cauchy-Schwarz inequality for random variables X and Y states that (E[XY])² ≤ E[X²]E[Y²]. What condition must hold for equality to occur?
- A. X and Y must be independent.
- B. X and Y must be identically distributed.
- C. One variable must be a scalar multiple of the other (Y = aX, almost surely).
- D. The expected values of X and Y must be zero.
4. Which of the following probability distributions is characterized by its memoryless property, meaning the probability of future events is independent of past events?
- A. Binomial Distribution
- B. Poisson Distribution
- C. Geometric Distribution
- D. Hypergeometric Distribution
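The memoryless property in this question can be checked directly from the geometric survival function. A short sketch, with p chosen arbitrarily: if X is the number of trials until the first success, then P(X > k) = (1 − p)^k, and conditioning on having already waited m trials changes nothing.

```python
# Illustrative sketch: memorylessness of the Geometric distribution.
# P(X > m + n | X > m) should equal P(X > n) for any m, n.
p = 0.3  # hypothetical success probability

def survival(k):
    # P(X > k): k consecutive failures before the first success.
    return (1 - p) ** k

m, n = 4, 6
conditional = survival(m + n) / survival(m)  # P(X > m+n | X > m)
print(conditional)
print(survival(n))  # the two values agree
```

The same identity holds for the Exponential distribution in continuous time; among the four options listed, only the Geometric has it.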
5. What is the definition of a stationary process in time series analysis?
- A. A process where the mean and variance change over time.
- B. A process whose statistical properties, such as mean, variance, and autocorrelation, do not change over time.
- C. A process that is strictly increasing.
- D. A process where all future values are predictable from past values.
6. The concept of 'sufficient statistic' in statistical inference means a statistic that:
- A. Is equal to the parameter being estimated.
- B. Contains all the information in the sample relevant to the parameter.
- C. Is always normally distributed.
- D. Is independent of the parameter being estimated.
7. In Bayesian statistics, what is the posterior distribution?
- A. The prior belief about a parameter before observing data.
- B. The probability of observing the data given the parameter.
- C. The updated probability distribution of a parameter after considering observed data.
- D. The probability of the parameter being true.
8. What is the fundamental property of a sigma-algebra (or Borel field) of events in probability theory?
- A. It contains only a finite number of events.
- B. It is closed under countable unions, countable intersections, and complements.
- C. It includes every possible outcome.
- D. It only contains mutually exclusive events.
9. The principle of maximum likelihood estimation (MLE) aims to find the parameter values that:
- A. Minimize the variance of the estimator.
- B. Maximize the probability of observing the given sample data.
- C. Minimize the bias of the estimator.
- D. Ensure the estimator is unbiased.
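The MLE idea can be demonstrated in a few lines. A sketch under an assumed Bernoulli model with a hypothetical sample: maximize the log-likelihood over a grid of candidate parameter values, and compare the result with the known closed-form MLE (the sample proportion).

```python
import math

# Illustrative sketch: maximum likelihood estimation for a Bernoulli
# parameter p. The data below are hypothetical (7 successes in 10 trials).
data = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]

def log_likelihood(p):
    # Log-probability of the observed sample under parameter p.
    return sum(math.log(p if x == 1 else 1 - p) for x in data)

# Grid search over candidate values of p (0 and 1 excluded to keep log finite).
grid = [i / 1000 for i in range(1, 1000)]
p_hat = max(grid, key=log_likelihood)

print(p_hat)                  # matches the closed-form MLE
print(sum(data) / len(data))  # sample proportion = 0.7
```

For Bernoulli data the likelihood is concave in p, so the grid maximizer lands on the sample proportion; grid search is used here only to make the "maximize the probability of the data" principle explicit.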
10. Kolmogorov's axioms of probability, foundational to modern probability theory, include that probability is non-negative, the probability of the sample space is 1, and:
- A. The probability of the union of two disjoint events is the product of their probabilities.
- B. The probability of the union of disjoint events is countably additive.
- C. The probability of any event is between 0 and 1.
- D. The probability of an impossible event is 1.
11. What does the concept of 'conjugate prior' imply in Bayesian inference?
- A. The posterior distribution is identical to the prior distribution.
- B. The posterior distribution belongs to the same family of distributions as the prior.
- C. The prior distribution is irrelevant to the posterior distribution.
- D. The posterior distribution is always uniform.
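Conjugacy is easiest to see with the classic Beta–Bernoulli pair. A sketch with hypothetical prior parameters and data: a Beta prior updated by Bernoulli observations yields another Beta, with the observed counts simply added to the prior's parameters.

```python
# Illustrative sketch: the Beta distribution is conjugate to a Bernoulli /
# binomial likelihood. Prior and data values below are hypothetical.
alpha, beta = 2.0, 2.0      # Beta(2, 2) prior over the success probability
successes, failures = 7, 3  # observed data

# Conjugate update: posterior is Beta(alpha + successes, beta + failures) --
# the same family as the prior, which is exactly what conjugacy means.
post_alpha = alpha + successes
post_beta = beta + failures
posterior_mean = post_alpha / (post_alpha + post_beta)
print(posterior_mean)  # 9 / 14
```

The posterior mean 9/14 sits between the prior mean (1/2) and the sample proportion (7/10), showing how the prior's pseudo-counts are blended with the data.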
12. The Cramér-Rao lower bound provides a theoretical limit for the variance of any unbiased estimator of a parameter. An estimator achieving this bound is called:
- A. A consistent estimator.
- B. A maximum likelihood estimator.
- C. An efficient estimator.
- D. A sufficient estimator.
13. In the context of multivariate statistics, what does the Jacobian matrix represent?
- A. The determinant of the covariance matrix.
- B. The rate of change of a vector-valued function.
- C. The correlation between two variables.
- D. The probability density of a joint distribution.
14. What is the fundamental characteristic of a Poisson process?
- A. It models continuous events occurring at fixed intervals.
- B. It models the number of events occurring in a fixed interval of time or space, with events occurring independently and at a constant average rate.
- C. It models the probability of success in a fixed number of trials.
- D. It models the time until the first success in a series of Bernoulli trials.
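The Poisson process described in this question can be simulated from its inter-arrival structure. A sketch with an assumed rate and window: successive arrival gaps are independent Exponential(rate) variables, and the count of arrivals in a window of length T is then Poisson(rate · T), so its average should be near rate · T.

```python
import random

# Illustrative sketch: simulate a Poisson process by accumulating
# Exponential(rate) inter-arrival times, then count arrivals in [0, T].
random.seed(1)
rate, T = 2.0, 5.0  # hypothetical event rate and observation window

def count_arrivals():
    t, count = 0.0, 0
    while True:
        t += random.expovariate(rate)  # waiting time to the next event
        if t > T:
            return count
        count += 1

counts = [count_arrivals() for _ in range(5000)]
avg = sum(counts) / len(counts)
print(round(avg, 1))  # close to rate * T = 10
```

This construction also makes the distractors easy to separate: option D describes the inter-arrival (geometric/exponential-style) view of a single gap, not the counting process itself.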
15. Which theorem states that for a sequence of independent and identically distributed random variables with finite variance, the sample mean converges in probability to the expected value?
- A. Central Limit Theorem
- B. Borel-Cantelli Lemma
- C. Law of Large Numbers
- D. Slutsky's Theorem
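The Law of Large Numbers in this question is directly observable in simulation. A minimal sketch: the sample mean of many fair-coin flips settles close to the expected value 0.5.

```python
import random

# Illustrative sketch: Law of Large Numbers for i.i.d. fair-coin flips
# (expected value 0.5). With a large n the sample mean is very close to 0.5.
random.seed(2)
n = 100_000
flips = [random.randint(0, 1) for _ in range(n)]
sample_mean = sum(flips) / n
print(round(sample_mean, 2))  # close to 0.5
```

Contrast with question 1: the LLN says where the sample mean goes; the CLT describes the shape of its fluctuations around that limit.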
16. What is the primary purpose of a p-value in hypothesis testing?
- A. To indicate the size of the effect.
- B. To represent the probability of obtaining test results at least as extreme as the results actually observed, assuming the null hypothesis is true.
- C. To state the probability that the null hypothesis is true.
- D. To measure the power of the test.
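The definition in option B can be computed exactly for a simple case. A sketch with hypothetical numbers: observing 60 heads in 100 flips of a supposedly fair coin, the one-sided p-value is the probability, under the null hypothesis p = 0.5, of a head count at least as extreme as the one observed.

```python
import math

# Illustrative sketch: exact one-sided binomial p-value. Data (60 heads in
# 100 flips) are hypothetical; H0 assumes a fair coin.
n, observed = 100, 60

def binom_pmf(k):
    # P(exactly k heads) under H0: Binomial(n, 0.5).
    return math.comb(n, k) * 0.5 ** n

# P(at least `observed` heads | H0 true) -- the p-value per option B.
p_value = sum(binom_pmf(k) for k in range(observed, n + 1))
print(round(p_value, 4))  # about 0.028
```

Note what this number is not: it is not the probability that H0 is true (option C), and it says nothing about effect size or power, which is exactly the distinction the question tests.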
17. The concept of ergodicity in stochastic processes implies that:
- A. The process's statistical properties are independent of time.
- B. Time averages are equal to ensemble averages.
- C. The process has a constant variance.
- D. The process converges to a normal distribution.
18. What is the definition of a 'sufficient statistic' in statistical inference?
- A. A statistic whose distribution depends only on the parameter being estimated.
- B. A statistic that provides all the information about the parameter that the entire sample contains.
- C. A statistic that is always an unbiased estimator.
- D. A statistic that is independent of the parameter being estimated.