paddle.distributed.new_group(ranks: Optional[list[int]] = None, backend: Optional[Literal['nccl']] = None, timeout: timedelta = datetime.timedelta(seconds=1800), nccl_comm_init_option: int = 0) → Group [source]

Creates a new distributed communication group.

Parameters
  • ranks (list, optional) – The global ranks of group members. The default is None.

  • backend (str, optional) – The backend used to create the group; only nccl is currently supported.

  • timeout (datetime.timedelta, optional) – The waiting timeout for store-related operations. The default is 30 minutes.

  • nccl_comm_init_option (int, optional) – Additional option flag for NCCL communicator initialization. The default is 0.

Returns

The group instance.

Return type

Group

Examples

>>> import paddle
>>> paddle.distributed.init_parallel_env()
>>> tindata = paddle.randn(shape=[2, 3])
>>> # Create a communication group containing global ranks 2, 4 and 6.
>>> gp = paddle.distributed.new_group([2, 4, 6])
>>> # All-reduce within the new group without waiting for completion (sync_op=False).
>>> paddle.distributed.all_reduce(tindata, group=gp, sync_op=False)
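
The following sketch is illustrative rather than part of the official examples: it shows the same call with the backend and timeout parameters passed explicitly as keyword arguments. The rank list [0, 1], the 10-minute timeout, and the variable name custom_gp are placeholder assumptions.

>>> import datetime
>>> import paddle
>>> paddle.distributed.init_parallel_env()
>>> # Hypothetical group over ranks 0 and 1, with an explicit backend and a
>>> # shorter store timeout than the 30-minute default.
>>> custom_gp = paddle.distributed.new_group(
...     ranks=[0, 1],
...     backend='nccl',
...     timeout=datetime.timedelta(minutes=10))
>>> paddle.distributed.all_reduce(paddle.randn(shape=[2, 3]), group=custom_gp)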