Searched refs:microbatch (Results 1 – 4 of 4) sorted by relevance
/aosp_15_r20/external/pytorch/torch/distributed/pipelining/
schedules.py
  29    from .microbatch import merge_chunks, split_args_kwargs_into_chunks, TensorChunkSpec
  2050  def need_bubble(stage, op, microbatch, num_stages_global, seen_ops):
  2052  if stage != 0 and (stage - 1, op, microbatch) not in seen_ops:
  2056  return (stage, _ComputationType.FORWARD, microbatch) not in seen_ops
  2057  return (stage + 1, op, microbatch) not in seen_ops
  2086  stage_index, op, microbatch = temp_action
  2088  stage_index, op, microbatch, num_stages_global, seen_ops
  2091  if microbatch is not None:
  2092  temp_seen_ops.add((stage_index, op, microbatch))
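Note: the line-29 import pulls in the microbatch split/merge helpers the schedule uses to carve a full batch into chunks, while the need_bubble hits appear to be the schedule simulator's dependency check (an op must wait, i.e. needs a bubble, if its predecessor op has not yet been seen). A minimal usage sketch of the split/merge helpers (illustrative only; not code from schedules.py):

    import torch
    from torch.distributed.pipelining.microbatch import (
        merge_chunks,
        split_args_kwargs_into_chunks,
        TensorChunkSpec,
    )

    # Split one full batch of args/kwargs into 4 microbatches along dim 0.
    args = (torch.randn(8, 16),)
    kwargs = {"mask": torch.ones(8, 16)}
    args_split, kwargs_split = split_args_kwargs_into_chunks(
        args,
        kwargs,
        chunks=4,
        args_chunk_spec=(TensorChunkSpec(0),),           # split dim per positional arg
        kwargs_chunk_spec={"mask": TensorChunkSpec(0)},  # split dim per kwarg
    )
    assert len(args_split) == 4
    assert args_split[0][0].shape == (2, 16)

    # Merge per-microbatch outputs back into one batch along dim 0.
    outputs = [a[0] * 2 for a in args_split]
    merged = merge_chunks(outputs, TensorChunkSpec(0))
    assert merged.shape == (8, 16)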
stage.py
  1165  microbatch: Optional[Union[torch.Tensor, List[torch.Tensor]]] = None,
  1198  if microbatch is None:
  1200  example_fwd_inputs = microbatch
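The stage.py hits show an optional example microbatch used to seed the stage's example forward inputs. A hypothetical sketch of that pattern (the helper name and the meta-tensor fallback are assumptions for illustration, not the file's actual code):

    from typing import List, Optional, Union
    import torch

    def make_example_inputs(
        microbatch: Optional[Union[torch.Tensor, List[torch.Tensor]]] = None,
    ) -> List[torch.Tensor]:
        # Mirrors the snippet's shape: when no example microbatch is given,
        # fall back to a placeholder (assumed here to be a meta tensor);
        # otherwise the microbatch itself becomes the example forward input.
        if microbatch is None:
            return [torch.empty(2, 16, device="meta")]  # assumed fallback shape
        return [microbatch] if isinstance(microbatch, torch.Tensor) else microbatch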
/aosp_15_r20/external/pytorch/test/distributed/pipelining/
test_microbatch.py
  7  from torch.distributed.pipelining.microbatch import (
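The import list is truncated in the hit; a minimal round-trip check in the same spirit (an assumed sketch, not the actual test body) could look like:

    import unittest
    import torch
    from torch.distributed.pipelining.microbatch import (
        merge_chunks,
        split_args_kwargs_into_chunks,
        TensorChunkSpec,
    )

    class MicrobatchRoundTripTest(unittest.TestCase):
        def test_split_then_merge(self):
            x = torch.randn(8, 4)
            # The default chunk spec splits positional tensor args along dim 0.
            args_split, _ = split_args_kwargs_into_chunks((x,), {}, chunks=4)
            merged = merge_chunks([a[0] for a in args_split], TensorChunkSpec(0))
            torch.testing.assert_close(merged, x)

    if __name__ == "__main__":
        unittest.main()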
/aosp_15_r20/external/pytorch/docs/source/
distributed.pipelining.rst
  207  the runtime input to the stage, which would be one microbatch worth of input
  454  .. automodule:: torch.distributed.pipelining.microbatch
  456  .. currentmodule:: torch.distributed.pipelining.microbatch
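Docs line 207 describes what a stage receives at runtime: one microbatch worth of input per step of the schedule. A rough sketch of that flow (assuming torch.distributed is initialized and `stage` is a PipelineStage already built for this rank; ScheduleGPipe and step() are the documented schedule API, the rest is illustrative):

    import torch
    from torch.distributed.pipelining import ScheduleGPipe

    def run_one_step(stage, full_batch: torch.Tensor, n_microbatches: int = 4):
        schedule = ScheduleGPipe(stage, n_microbatches)
        # step() splits the full batch into n_microbatches chunks; the
        # stage's forward then runs once per microbatch-sized chunk.
        if stage.is_first:
            schedule.step(full_batch)
        else:
            return schedule.step()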