- class paddle.distributed.RowWiseParallel(is_input_parallel: bool = True) [source]
Row-wise parallel plan for the model-parallel (mp) config. It will try to split the weight along the first dimension. This API is designed for paddle.nn.Linear or paddle.nn.Embedding. If an instance of any other paddle.nn.Layer is passed, this plan will still try to split layer.weight if the layer has one.
Note
layer.weight should have two dims.
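To make the split concrete, the following single-process sketch emulates 2-way row-wise parallelism for a weight of shape [in_features, out_features]. It is only an illustration (the shard count, the shapes, and the manual sum standing in for the all-reduce are assumptions for this sketch, not Paddle's internal sharding code): each rank would hold one shard of the weight along dim 0, consume the matching slice of the input, and contribute a partial result.

>>> # Illustration only: emulate 2-way row-wise sharding of a Linear weight.
>>> import paddle
>>> world_size = 2
>>> weight = paddle.randn([8, 8])    # [in_features, out_features]
>>> x = paddle.randn([4, 8])         # [batch, in_features]
>>> w_shards = paddle.split(weight, world_size, axis=0)   # each shard: [4, 8]
>>> x_slices = paddle.split(x, world_size, axis=1)        # matching input slices
>>> partial = [x_slices[i] @ w_shards[i] for i in range(world_size)]
>>> out = sum(partial)               # stands in for the all-reduce of partial results
>>> assert paddle.allclose(out, x @ weight, atol=1e-5).item()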
- Parameters
is_input_parallel (bool) – Whether the input is a local (already sharded) tensor or a global tensor. If the input is a global tensor, an extra split operation will be inserted. The default value is True, which means the input is a local tensor.
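For reference, the two construction variants this flag selects between (a minimal sketch using only the documented parameter):

>>> import paddle.distributed as dist
>>> plan_local = dist.RowWiseParallel()                          # default: input is already a local (sharded) tensor
>>> plan_global = dist.RowWiseParallel(is_input_parallel=False)  # input is a global tensor; an extra split is inserted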
Examples
>>> import paddle
>>> import paddle.distributed as dist
>>> class MLP(paddle.nn.Layer):
...     def __init__(self):
...         super().__init__()
...         self.fc1 = paddle.nn.Linear(8, 8)
...         self.fc2 = paddle.nn.Linear(8, 8)
...
...     def forward(self, input):
...         return self.fc2(self.fc1(input))
>>>
>>> layer = MLP()
>>> mp_config = {
...     'fc1': dist.RowWiseParallel()
... }
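The keys of mp_config correspond to sublayer names in the model's layer hierarchy, so 'fc1' above refers to layer.fc1. A quick way to list the names available for such a plan mapping (plain paddle.nn.Layer API, added here for illustration only):

>>> # Sublayer names that can be used as keys in the plan mapping above.
>>> [name for name, _ in layer.named_sublayers()]
['fc1', 'fc2']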