arkouda.scipy

Package Contents

Classes

Power_divergenceResult

The results of a power divergence statistical test.

Functions

chisquare(f_obs[, f_exp, ddof])

Computes the chi-square statistic and p-value.

power_divergence(f_obs[, f_exp, ddof, lambda_])

Computes the power divergence statistic and p-value.

class arkouda.scipy.Power_divergenceResult[source]

Bases: namedtuple('Power_divergenceResult', ('statistic', 'pvalue'))

The results of a power divergence statistical test.

statistic
Type: numpy.float64

pvalue
Type: numpy.float64
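
Because Power_divergenceResult is a namedtuple, the statistic and pvalue fields can be read by attribute or by ordinary tuple unpacking. A minimal sketch with illustrative (not computed) values:

>>> from arkouda.scipy import Power_divergenceResult
>>> res = Power_divergenceResult(statistic=8.33, pvalue=0.04)  # illustrative values
>>> res.statistic, res.pvalue
(8.33, 0.04)
>>> stat, p = res  # plain tuple unpacking also works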

arkouda.scipy.chisquare(f_obs, f_exp=None, ddof=0)[source]

Computes the chi-square statistic and p-value.

Parameters:
  • f_obs (pdarray) – The observed frequencies.

  • f_exp (pdarray, default = None) – The expected frequencies.

  • ddof (int) – The delta degrees of freedom.

Return type: arkouda.scipy.Power_divergenceResult

Examples

>>> import arkouda as ak
>>> ak.connect()
>>> from arkouda.scipy import chisquare
>>> chisquare(ak.array([10, 20, 30, 10]), ak.array([10, 30, 20, 10]))
Power_divergenceResult(statistic=8.333333333333334, pvalue=0.03960235520756414)
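
The optional parameters can be sketched as follows, assuming the scipy.stats.chisquare semantics carry over to this implementation: with f_exp=None the expected frequencies are taken as uniform across categories, and ddof lowers the degrees of freedom used for the p-value (k - 1 - ddof). Outputs are elided because the exact values depend on those settings.

>>> obs = ak.array([10, 20, 30, 10])
>>> chisquare(obs)  # f_exp=None: expected frequencies assumed uniform
Power_divergenceResult(...)
>>> chisquare(obs, ak.array([10, 30, 20, 10]), ddof=1)  # p-value uses k - 1 - ddof degrees of freedom
Power_divergenceResult(...)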

See also

scipy.stats.chisquare, arkouda.scipy.power_divergence

References

[1] “Chi-squared test”, https://en.wikipedia.org/wiki/Chi-squared_test

[2] “scipy.stats.chisquare”, https://docs.scipy.org/doc/scipy/reference/generated/scipy.stats.chisquare.html

arkouda.scipy.power_divergence(f_obs, f_exp=None, ddof=0, lambda_=None)[source]

Computes the power divergence statistic and p-value.

Parameters:
  • f_obs (pdarray) – The observed frequencies.

  • f_exp (pdarray, default = None) – The expected frequencies.

  • ddof (int) – The delta degrees of freedom.

  • lambda_ (string, default = "pearson") –

    The power in the Cressie-Read power divergence statistic. Allowed values: “pearson”, “log-likelihood”, “freeman-tukey”, “mod-log-likelihood”, “neyman”, “cressie-read”

    The powers correspond as follows:

    “pearson”: 1

    “log-likelihood”: 0

    “freeman-tukey”: -0.5

    “mod-log-likelihood”: -1

    “neyman”: -2

    “cressie-read”: 2/3

Return type: arkouda.scipy.Power_divergenceResult

Examples

>>> import arkouda as ak
>>> ak.connect()
>>> from arkouda.scipy import power_divergence
>>> x = ak.array([10, 20, 30, 10])
>>> y = ak.array([10, 30, 20, 10])
>>> power_divergence(x, y, lambda_="pearson")
Power_divergenceResult(statistic=8.333333333333334, pvalue=0.03960235520756414)
>>> power_divergence(x, y, lambda_="log-likelihood")
Power_divergenceResult(statistic=8.109302162163285, pvalue=0.04380595350226197)
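
The lambda_="pearson" result above can be checked directly from the correspondence listed under Parameters: with power 1, the statistic reduces to sum((f_obs - f_exp)**2 / f_exp). A minimal client-side NumPy sketch of that check (not part of the arkouda API):

>>> import numpy as np
>>> obs = np.array([10, 20, 30, 10], dtype=float)
>>> exp = np.array([10, 30, 20, 10], dtype=float)
>>> float(((obs - exp) ** 2 / exp).sum())  # Pearson (power 1) statistic
8.333333333333334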

See also

scipy.stats.power_divergence, arkouda.scipy.chisquare

Notes

This is a modified version of scipy.stats.power_divergence [2], adapted to operate on arkouda pdarrays for scalability.

References

[1] “scipy.stats.power_divergence”, https://docs.scipy.org/doc/scipy/reference/generated/scipy.stats.power_divergence.html

[2] Scipy contributors (2024) scipy (Version v1.12.0) [Source code]. https://github.com/scipy/scipy