[Pytorch] ValueError: sampler option is mutually exclusive with shuffle

whiskey的冰: Solution: move the shuffling into the DistributedSampler and drop the shuffle argument from the DataLoader:

```python
from torch.utils.data import DataLoader
from torch.utils.data.distributed import DistributedSampler

# Create the distributed sampler
train_sampler = DistributedSampler(
    dataset=train_dataset,
    shuffle=True  # set shuffle=True here, on the sampler
)

# When creating the DataLoader, do not set shuffle; use only the sampler
train_loader = DataLoader(
    dataset=train_dataset,
    batch_size=batch_size,
    sampler=train_sampler,  # use the distributed sampler
    num_workers=num_workers
)
```

Then, during training, remember to call the sampler's set_epoch method at the start of each epoch:

```python
for epoch in range(num_epochs):
    train_sampler.set_epoch(epoch)  # ensures a different shuffle order each epoch
    # training loop
```
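For context, here is a minimal sketch of the situation that triggers the error (the toy dataset and sizes are made up for illustration): DataLoader rejects shuffle=True together with an explicit sampler, because the sampler already determines the iteration order.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
from torch.utils.data.distributed import DistributedSampler

# Toy stand-in for train_dataset (hypothetical data, for illustration only)
train_dataset = TensorDataset(torch.randn(100, 8), torch.randint(0, 2, (100,)))

# Passing num_replicas/rank explicitly avoids needing an initialized process group
sampler = DistributedSampler(train_dataset, num_replicas=2, rank=0, shuffle=True)

# Raises: ValueError: sampler option is mutually exclusive with shuffle
loader = DataLoader(train_dataset, batch_size=16, shuffle=True, sampler=sampler)
```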