- class paddle.nn.initializer.KaimingUniform ( fan_in: Optional[float] = None, negative_slope: float = 0.0, nonlinearity: str = 'relu', mode: str = 'fan_in' ) [source]
Implements the Kaiming Uniform initializer.
This class implements the weight initialization scheme from the paper Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification by Kaiming He, Xiangyu Zhang, Shaoqing Ren and Jian Sun. It is a robust initialization method that specifically accounts for rectifier nonlinearities.
In the case of a uniform distribution, the range is [-x, x], where
\[x = gain \times \sqrt{\frac{3}{fan\_in}}\]
- Parameters
fan_in (float32|None, optional) – the fan_in (in_features) of the trainable Tensor. If None, it is inferred automatically from the Tensor. If you do not want to use the Tensor's in_features, set 'fan_in' to a suitable value yourself. Default is None.
negative_slope (float, optional) – the negative slope of the rectifier (only used when nonlinearity is 'leaky_relu'). Default is 0.0.
nonlinearity (str, optional) – the non-linear function. Default is 'relu'.
mode (str, optional) – the mode of initialization, either 'fan_in' or 'fan_out'. When set to 'fan_in', the fan_in parameter is used for initialization. When set to 'fan_out', the out_features of the trainable Tensor is used instead. Default is 'fan_in'.
Note
It is recommended to set fan_in to None for most cases.
Examples
>>> import paddle
>>> import paddle.nn as nn
>>> linear = nn.Linear(2, 4, weight_attr=nn.initializer.KaimingUniform())
>>> data = paddle.rand([30, 10, 2], dtype='float32')
>>> res = linear(data)
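To make the formula above concrete, the following is a minimal pure-Python sketch (not part of the Paddle API) that computes the uniform bound x = gain * sqrt(3 / fan_in). The gain values assumed here — sqrt(2) for 'relu' and sqrt(2 / (1 + negative_slope^2)) for 'leaky_relu' — follow the He et al. paper; the helper name kaiming_uniform_bound is hypothetical.

```python
import math

def kaiming_uniform_bound(fan_in, negative_slope=0.0, nonlinearity='relu'):
    """Upper bound x of the uniform range [-x, x] used by Kaiming
    uniform initialization: x = gain * sqrt(3 / fan_in).

    Gain values follow He et al. (assumed, not read from Paddle):
    sqrt(2) for relu, sqrt(2 / (1 + a^2)) for leaky_relu with slope a.
    """
    if nonlinearity == 'relu':
        gain = math.sqrt(2.0)
    elif nonlinearity == 'leaky_relu':
        gain = math.sqrt(2.0 / (1.0 + negative_slope ** 2))
    else:
        gain = 1.0  # e.g. a linear/identity activation
    return gain * math.sqrt(3.0 / fan_in)

# For the Linear(2, 4) layer in the example above, fan_in is 2,
# so the weights are drawn from [-sqrt(3), sqrt(3)]:
bound = kaiming_uniform_bound(2)
```

With negative_slope = 0.0, 'leaky_relu' reduces to the same gain as 'relu', matching the parameter defaults above.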