Sunday, 15 April 2018

MCQ 4



76. Single censoring is a special case of
      A.    Censoring.
      B.    Sampling.
      C.    Double censoring.
      D.    Census.
77. In a censored distribution we do not know the individual values of the censored observations, but we do know how many of them there are; hence a censored distribution gives us some extra information compared with a
      A.    Truncated distribution.
      B.    Any distribution.
      C.    Other distribution.
      D.    Binomial distribution.

78. The extra information in the case of a censored distribution adds efficiency to our
      A.    Calculation process.
      B.    Poisson process.
      C.    Estimation process.
      D.    Statistical process.
79. The number of types of censoring is
      A.    Two.
      B.    Three.
      C.    Four.
      D.    Five.
80. Gupta discriminated between two types of censored
      A.    Populations.
      B.    Cases.
      C.    Places.
      D.    Samples.
81. In Type-I censoring we fix the time, but the observations are
      A.    Random sequences.
      B.    Random variables.
      C.    Random numbers.
      D.    Random entries.

82. In Type-II censoring we fix the number of observations before the
      A.    System stops.
      B.    Number stops.
      C.    Experiment stops.
      D.    Process stops.
83. "We will stop the experiment after 20 bulbs have burnt out" is an example of
      A.    Type-II censoring.
      B.    Type-III censoring.
      C.    Type-I censoring.
      D.    Type-IV censoring.
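For Questions 81-83, here is a minimal simulation sketch of the distinction (Python; the number of bulbs, mean lifetime, and censoring time are illustrative assumptions, not values taken from the questions). In Type-I censoring the stopping time is fixed and the number of failures observed is random; in Type-II censoring the number of failures is fixed (20 bulbs, as in Question 83) and the stopping time is random.

import numpy as np

rng = np.random.default_rng(0)
lifetimes = rng.exponential(scale=1000.0, size=50)   # 50 bulbs, assumed mean life 1000 hours

# Type-I censoring: stop at a fixed time; how many failures we see is random.
t_censor = 800.0
type1_failures = np.sort(lifetimes[lifetimes <= t_censor])
print("Type-I : fixed time", t_censor, "->", type1_failures.size, "failures observed")

# Type-II censoring: stop after a fixed number of failures; the stopping time is random.
r = 20
type2_failures = np.sort(lifetimes)[:r]
print("Type-II: fixed", r, "failures -> experiment stops at time", round(type2_failures[-1], 1))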
84. The theory of Type-II censoring is simpler than that of
      A.    Type-I censoring.
      B.    Type-III censoring.
      C.    Type-V censoring.
      D.    Type-IV censoring.
85. As the sample size increases, the two types of censoring become
      A.    Different.
      B.    Overlapping.
      C.    Simple.
      D.    Equal.
 
86. The ratio of two non-central chi-square variables with n1 and n2 degrees of freedom follows a non-central F distribution with non-centrality
      A.    Statistics λ1 and λ2.
      B.    Means λ1 and λ2.
      C.    Parameters λ1 and λ2.
      D.    Variances.
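For reference (Question 86), the doubly non-central F distribution is usually defined with each chi-square divided by its own degrees of freedom; in standard notation:

F' = \frac{X_1 / n_1}{X_2 / n_2}, \qquad X_1 \sim \chi^2_{n_1}(\lambda_1), \quad X_2 \sim \chi^2_{n_2}(\lambda_2), \quad X_1, X_2 \text{ independent},

so the ratio follows a (doubly) non-central F distribution with non-centrality parameters λ1 and λ2, which is the sense intended by option C.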
87. A family of distributions in which the range of the variable contains the parameter is called a
      A.    Regular distribution.
      B.    Non-regular distribution.
      C.    Simple distribution.
      D.    t-distribution.

88. The MLE is a function of the sufficient statistic for θ whenever the sufficient statistic
      A.    Does not exist.
      B.    Exists but is rare.
      C.    Exists but is not simple.
      D.    Exists.

89. The likelihood function is the joint density of all the
      A.    Population observations.
      B.    Observations.
      C.    Sample observations.
      D.    General observations.
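As a concrete illustration of Question 89, a minimal sketch (Python, with made-up data) of the likelihood of an i.i.d. normal sample: it is the joint density of the sample observations, i.e. the product of the individual densities, computed here on the log scale for numerical stability.

import numpy as np

x = np.array([4.9, 5.2, 4.7, 5.5, 5.1])   # hypothetical sample observations

def normal_log_likelihood(mu, sigma, data):
    # sum of log N(mu, sigma^2) densities = log of the joint density of the sample
    return np.sum(-0.5 * np.log(2 * np.pi * sigma**2) - (data - mu)**2 / (2 * sigma**2))

print(normal_log_likelihood(5.0, 0.5, x))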

       
90. A statistic is called an efficient estimator of a parameter if its variance attains the
      A.    CRLB.
      B.    CRD.
      C.    RCD.
      D.    RCBD.
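For reference (Question 90), the Cramér-Rao lower bound (CRLB) for an unbiased estimator T of θ based on an i.i.d. sample of size n, in the usual regular-case notation:

\operatorname{Var}(T) \ge \frac{1}{n I(\theta)}, \qquad I(\theta) = \operatorname{E}\left[ \left( \frac{\partial}{\partial \theta} \log f(X; \theta) \right)^{2} \right].

For example, for X ~ N(μ, σ²) with σ known, I(μ) = 1/σ², so the bound for μ is σ²/n; the sample mean has exactly this variance and is therefore an efficient estimator of μ.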
91. If two estimators are given, then the more efficient one is the one whose variance is
      A.    Maximum.
      B.    Small.
      C.    Large.
      D.    Intermediate.
92. The minimum variance estimator is unique, irrespective of whether any
      A.    Value is attained.
      B.    Number is attained.
      C.    Bound is attained.
      D.    Limit is attained.
93. A larger value of the Fisher information indicates a more precise
      A.    Estimator.
      B.    Calculator.
      C.    Observer.
      D.    Predictor.
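A small numerical check of Questions 90-93 (Python; the parameter values are arbitrary): larger Fisher information means a smaller Cramér-Rao bound, and for the normal mean the sample mean's variance matches that bound, σ²/n.

import numpy as np

# For X_i ~ N(mu, sigma^2) the per-observation Fisher information about mu is
# 1/sigma^2, so the CRLB for the sample mean is sigma^2 / n.  Larger information
# (smaller sigma or larger n) gives a smaller bound, i.e. a more precise estimator.
rng = np.random.default_rng(1)
mu, sigma, n, reps = 5.0, 2.0, 25, 20000

sample_means = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)
print("empirical Var(x_bar):", round(sample_means.var(), 4))
print("CRLB sigma^2 / n    :", sigma**2 / n)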

94. Any one-to-one function of a sufficient statistic
      A.    Need not be sufficient.
      B.    Is also sufficient.
      C.    Is efficient.
      D.    Is unbiased.
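A standard worked example for Question 94 (Bernoulli sample, factorization criterion):

f(x_1, \dots, x_n; p) = \prod_{i=1}^{n} p^{x_i} (1-p)^{1-x_i} = p^{t} (1-p)^{n-t} \cdot 1, \qquad t = \sum_{i=1}^{n} x_i,

so T = Σ x_i is sufficient for p, and since the sample proportion x̄ = T/n is a one-to-one function of T, it is also sufficient.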
95. A sufficient estimator is most efficient if
      A.    It exists.
      B.    It is unbiased.
      C.    It is biased.
      D.    It is consistent.
96. A statistic is minimal sufficient if it is a single-valued function of every other
      A.    Population.
      B.    Observations.
      C.    Sufficient statistic.
      D.    Sample.
97. The main objects of sampling are to provide an estimate of a population parameter or to provide maximum information about
      A.    Population parameter.
      B.    Statistics.
      C.    An estimate.
      D.    Sample.
98. The main object of probability theory is to determine the reliability of
      A.    Sampling.
      B.    Hypothesis.
      C.    Estimate.
      D.    Population parameter.

99. Giving a reliable estimate of an unknown population parameter on the basis of sample observations is called
      A.    Inference theory.
      B.    Estimation theory.
      C.    Statistical theory.
      D.    Statistical inference.
100. Making a statement about the general on the basis of the specific is called
      A.    Deductive inference.
      B.    Suggestion.
      C.    Inductive inference.
      D.    Statistical inference.





Answers:
76. C    77. A    78. C    79. A    80. D
81. B    82. C    83. A    84. A    85. D
86. C    87. B    88. D    89. C    90. A
91. B    92. C    93. A    94. B    95. A
96. C    97. A    98. C    99. D    100. C





