class paddle.distributed.Shard
Shard describes how a Tensor is split across multiple devices along a specified dimension.
Parameters

dim (int) – the dimension along which the tensor is split.
Examples
>>> import paddle
>>> import paddle.distributed as dist
>>> mesh = dist.ProcessMesh([[2, 4, 5], [0, 1, 3]], dim_names=['x', 'y'])
>>> a = paddle.to_tensor([[1, 2, 3], [5, 6, 7]])
>>>
>>> # distributed tensor
>>> d_tensor = dist.shard_tensor(a, mesh, [dist.Shard(0), dist.Shard(1)])
get_co_shard_order(self: paddle.base.libpaddle.Shard) -> int
get_dim(self: paddle.base.libpaddle.Shard) -> int
get_split_factor(self: paddle.base.libpaddle.Shard) -> int
is_partial(self: paddle.base.libpaddle.Placement) -> bool
is_replicated(self: paddle.base.libpaddle.Placement) -> bool
is_shard(self: paddle.base.libpaddle.Placement, dim: Optional[int] = None) -> bool
set_split_factor(self: paddle.base.libpaddle.Shard, arg0: int) -> None