betainc - Maple Help

DeepLearning,Tensor,betainc

compute the incomplete Beta function of entries in a Tensor

DeepLearning,Tensor,expm1

compute exp(x) - 1 for entries in a Tensor

DeepLearning,Tensor,lbeta

compute the log of the absolute value of the Beta function for entries in a Tensor

DeepLearning,Tensor,lgamma

compute the log of the absolute value of the Gamma function for entries in a Tensor

DeepLearning,Tensor,log1p

compute ln(1 + x) for entries in a Tensor

DeepLearning,Tensor,log_sigmoid

compute the log-sigmoid of entries in a Tensor

DeepLearning,Tensor,rsqrt

compute the reciprocal square root of entries in a Tensor

DeepLearning,Tensor,sigmoid

compute the sigmoid of entries in a Tensor

Calling Sequence

 betainc(t, u, v, opts)
 expm1(t, opts)
 lbeta(t, opts)
 lgamma(t, opts)
 log1p(t, opts)
 log_sigmoid(t, opts)
 rsqrt(t, opts)
 sigmoid(t, opts)

Parameters

 t - Tensor
 u - Tensor
 v - Tensor
 opts - zero or more options as specified below

Options

 • name=string

The value of option name specifies an optional name for this Tensor, to be displayed in output and when visualizing the dataflow graph.

Description

 • The betainc(t,u,v,opts) command computes the regularized incomplete Beta function elementwise, with t and u supplying the shape parameters and v the upper limit of integration.
 • The expm1(t,opts) command computes e^x - 1 for each entry x in a Tensor; it is more accurate than computing exp(x) - 1 directly when x is near zero.
 • The lbeta(t,opts) command computes the natural logarithm of the absolute value of the Beta function for entries in a Tensor.
 • The lgamma(t,opts) command computes ln|Γ(x)| for each entry x in a Tensor.
 • The log1p(t,opts) command computes ln(1 + x) for each entry x in a Tensor; it is more accurate than computing log(1 + x) directly when x is near zero.
 • The log_sigmoid(t,opts) command computes the log-sigmoid ln(σ(x)) = -ln(1 + e^(-x)) for each entry x in a Tensor.
 • The rsqrt(t,opts) command computes the reciprocal of the square root, 1/√x, for each entry x in a Tensor.
 • The sigmoid(t,opts) command computes the sigmoid σ(x) = 1/(1 + e^(-x)) for each entry x in a Tensor.
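The elementwise definitions above can be cross-checked numerically outside Maple. The sketch below uses NumPy/SciPy equivalents of each function; it illustrates the underlying mathematics only and is not Maple code.

```python
import numpy as np
from scipy.special import betainc, gammaln

x = np.array([[11.0, 18.3], [12.1, 20.3]])

expm1_vals = np.expm1(x)                  # e^x - 1, accurate for small x
log1p_vals = np.log1p(x)                  # ln(1 + x), accurate for small x
lgamma_vals = gammaln(x)                  # ln|Gamma(x)|
rsqrt_vals = 1.0 / np.sqrt(x)             # reciprocal square root
sigmoid_vals = 1.0 / (1.0 + np.exp(-x))   # sigmoid
log_sigmoid_vals = -np.log1p(np.exp(-x))  # ln(sigmoid(x)), numerically stable form

# Regularized incomplete Beta I_x(a, b); it equals 1 at the upper limit x = 1.
beta_at_one = betainc(11.0, 96.0, 1.0)
```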

Examples

 > $\mathrm{with}\left(\mathrm{DeepLearning}\right):$
 > $X≔\mathrm{Matrix}\left(\left[\left[11.0,18.3\right],\left[12.1,20.3\right]\right],\mathrm{datatype}=\mathrm{float}\left[8\right]\right)$
 ${X}{≔}\left[\begin{array}{cc}{11.}& {18.3000000000000}\\ {12.1000000000000}& {20.3000000000000}\end{array}\right]$ (1)
 > $Y≔\mathrm{Matrix}\left(\left[\left[96.0,12.8\right],\left[8.7,27.6\right]\right],\mathrm{datatype}=\mathrm{float}\left[8\right]\right)$
 ${Y}{≔}\left[\begin{array}{cc}{96.}& {12.8000000000000}\\ {8.70000000000000}& {27.6000000000000}\end{array}\right]$ (2)
 > $Z≔\mathrm{Matrix}\left(\left[\left[1.,1.\right],\left[1.,1.\right]\right],\mathrm{datatype}=\mathrm{float}\left[8\right]\right)$
 ${Z}{≔}\left[\begin{array}{cc}{1.}& {1.}\\ {1.}& {1.}\end{array}\right]$ (3)
 > $\mathrm{t1}≔\mathrm{Constant}\left(X\right)$
 ${\mathrm{t1}}{≔}\left[\begin{array}{c}{\mathrm{DeepLearning Tensor}}\\ {\mathrm{Name: none}}\\ {\mathrm{Shape: undefined}}\\ {\mathrm{Data Type: float\left[8\right]}}\end{array}\right]$ (4)
 > $\mathrm{t2}≔\mathrm{Constant}\left(Y\right)$
 ${\mathrm{t2}}{≔}\left[\begin{array}{c}{\mathrm{DeepLearning Tensor}}\\ {\mathrm{Name: none}}\\ {\mathrm{Shape: undefined}}\\ {\mathrm{Data Type: float\left[8\right]}}\end{array}\right]$ (5)
 > $\mathrm{t3}≔\mathrm{Constant}\left(Z\right)$
 ${\mathrm{t3}}{≔}\left[\begin{array}{c}{\mathrm{DeepLearning Tensor}}\\ {\mathrm{Name: none}}\\ {\mathrm{Shape: undefined}}\\ {\mathrm{Data Type: float\left[8\right]}}\end{array}\right]$ (6)
 > $\mathrm{betainc}\left(\mathrm{t1},\mathrm{t2},\mathrm{t3}\right)$
 ${\mathrm{betainc}}{}\left(\left[\begin{array}{c}{\mathrm{DeepLearning Tensor}}\\ {\mathrm{Name: none}}\\ {\mathrm{Shape: undefined}}\\ {\mathrm{Data Type: float\left[8\right]}}\end{array}\right]{,}\left[\begin{array}{c}{\mathrm{DeepLearning Tensor}}\\ {\mathrm{Name: none}}\\ {\mathrm{Shape: undefined}}\\ {\mathrm{Data Type: float\left[8\right]}}\end{array}\right]{,}\left[\begin{array}{c}{\mathrm{DeepLearning Tensor}}\\ {\mathrm{Name: none}}\\ {\mathrm{Shape: undefined}}\\ {\mathrm{Data Type: float\left[8\right]}}\end{array}\right]\right)$ (7)
 > $\mathrm{expm1}\left(\mathrm{t1}\right)$
 ${\mathrm{expm1}}{}\left(\left[\begin{array}{c}{\mathrm{DeepLearning Tensor}}\\ {\mathrm{Name: none}}\\ {\mathrm{Shape: undefined}}\\ {\mathrm{Data Type: float\left[8\right]}}\end{array}\right]\right)$ (8)
 > $\mathrm{lbeta}\left(\mathrm{t1}\right)$
 ${\mathrm{lbeta}}{}\left(\left[\begin{array}{c}{\mathrm{DeepLearning Tensor}}\\ {\mathrm{Name: none}}\\ {\mathrm{Shape: undefined}}\\ {\mathrm{Data Type: float\left[8\right]}}\end{array}\right]\right)$ (9)
 > $\mathrm{lgamma}\left(\mathrm{t1}\right)$
 ${\mathrm{lgamma}}{}\left(\left[\begin{array}{c}{\mathrm{DeepLearning Tensor}}\\ {\mathrm{Name: none}}\\ {\mathrm{Shape: undefined}}\\ {\mathrm{Data Type: float\left[8\right]}}\end{array}\right]\right)$ (10)
 > $\mathrm{log1p}\left(\mathrm{t1}\right)$
 ${\mathrm{log1p}}{}\left(\left[\begin{array}{c}{\mathrm{DeepLearning Tensor}}\\ {\mathrm{Name: none}}\\ {\mathrm{Shape: undefined}}\\ {\mathrm{Data Type: float\left[8\right]}}\end{array}\right]\right)$ (11)
 > $\mathrm{log_sigmoid}\left(\mathrm{t2}\right)$
 ${\mathrm{log_sigmoid}}{}\left(\left[\begin{array}{c}{\mathrm{DeepLearning Tensor}}\\ {\mathrm{Name: none}}\\ {\mathrm{Shape: undefined}}\\ {\mathrm{Data Type: float\left[8\right]}}\end{array}\right]\right)$ (12)
 > $\mathrm{rsqrt}\left(\mathrm{t2}\right)$
 ${\mathrm{rsqrt}}{}\left(\left[\begin{array}{c}{\mathrm{DeepLearning Tensor}}\\ {\mathrm{Name: none}}\\ {\mathrm{Shape: undefined}}\\ {\mathrm{Data Type: float\left[8\right]}}\end{array}\right]\right)$ (13)
 > $\mathrm{sigmoid}\left(\mathrm{t2}\right)$
 ${\mathrm{sigmoid}}{}\left(\left[\begin{array}{c}{\mathrm{DeepLearning Tensor}}\\ {\mathrm{Name: none}}\\ {\mathrm{Shape: undefined}}\\ {\mathrm{Data Type: float\left[8\right]}}\end{array}\right]\right)$ (14)
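As a numerical sanity check on the example above (assuming betainc follows the TensorFlow convention betainc(a, b, x) = I_x(a, b)): because Z is all ones, the incomplete Beta function is evaluated at its upper limit, so every entry of betainc(t1, t2, t3) is 1; and for entries as large as those of Y, the sigmoid saturates very close to 1. The NumPy/SciPy sketch below reproduces this.

```python
import numpy as np
from scipy.special import betainc

X = np.array([[11.0, 18.3], [12.1, 20.3]])
Y = np.array([[96.0, 12.8], [8.7, 27.6]])
Z = np.ones((2, 2))

# With Z all ones, the regularized incomplete Beta is evaluated at its
# upper limit of integration, so every entry equals 1.
betainc_vals = betainc(X, Y, Z)

# For arguments of this magnitude the sigmoid saturates near 1.
sigmoid_vals = 1.0 / (1.0 + np.exp(-Y))
```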

Compatibility

 • The DeepLearning,Tensor,betainc, DeepLearning,Tensor,expm1, DeepLearning,Tensor,lbeta, DeepLearning,Tensor,lgamma, DeepLearning,Tensor,log1p, DeepLearning,Tensor,log_sigmoid, DeepLearning,Tensor,rsqrt and DeepLearning,Tensor,sigmoid commands were introduced in Maple 2018.