
The "freeze_support()" line can be omitted if the program is not going to be frozen to produce an executable.

c1a1o1 opened this issue Oct 18, 2018 · 5 comments

C:\Users\caocao\Anaconda3\python.exe D:/work/work11/SingleGAN-master/train.py --dataroot datasets/apple2orange --checkpoints_dir checkpoints --display_id 3 --name base_apple2orange --mode base --loadSize 143 --fineSize 138 --input_nc 3 --niter 100 --niter_decay 100 --lambda_ide 1 --display_port 8097 --batchSize 1 --ngf 64 --ndf 64
------------ Options -------------
batchSize: 1
c_gan_mode: lsgan
c_num: 0
checkpoints_dir: checkpoints
continue_train: False
d_num: 2
dataroot: datasets/apple2orange
display_freq: 400
display_id: 3
display_port: 8097
display_winsize: 256
e_blocks: 6
fineSize: 138
gpu: 0
init_type: xavier
input_nc: 3
isTrain: True
is_flip: 1
lambda_c: 0.5
lambda_cyc: 10.0
lambda_ide: 1.0
lambda_kl: 0.01
loadSize: 143
lr: 0.0002
mode: base
nThreads: 4
name: base_apple2orange
ndf: 64
nef: 64
ngf: 64
niter: 100
niter_decay: 100
no_html: False
norm: instance
output_nc: 3
print_freq: 200
save_epoch_freq: 5
save_latest_freq: 10000
up_type: Trp
update_html_freq: 4000
which_epoch: latest
-------------- End ----------------
Start preprocessing dataset..!
Finished preprocessing dataset..!
create web directory checkpoints\base_apple2orange\2018_10_18_15_52_57\web...
D:\work\work11\SingleGAN-master\models\model.py:46: UserWarning: nn.init.xavier_normal is now deprecated in favor of nn.init.xavier_normal_.
init.xavier_normal(m.weight.data, gain=math.sqrt(2))
D:\work\work11\SingleGAN-master\models\model.py:56: UserWarning: nn.init.constant is now deprecated in favor of nn.init.constant_.
init.constant(m.bias.data, 0.0)
------------ Options -------------
batchSize: 1
c_gan_mode: lsgan
c_num: 0
checkpoints_dir: checkpoints
continue_train: False
d_num: 2
dataroot: datasets/apple2orange
display_freq: 400
display_id: 3
display_port: 8097
display_winsize: 256
e_blocks: 6
fineSize: 138
gpu: 0
init_type: xavier
input_nc: 3
isTrain: True
is_flip: 1
lambda_c: 0.5
lambda_cyc: 10.0
lambda_ide: 1.0
lambda_kl: 0.01
loadSize: 143
lr: 0.0002
mode: base
nThreads: 4
name: base_apple2orange
ndf: 64
nef: 64
ngf: 64
niter: 100
niter_decay: 100
no_html: False
norm: instance
output_nc: 3
print_freq: 200
save_epoch_freq: 5
save_latest_freq: 10000
up_type: Trp
update_html_freq: 4000
which_epoch: latest
-------------- End ----------------
Start preprocessing dataset..!
Finished preprocessing dataset..!
create web directory checkpoints\base_apple2orange\2018_10_18_15_53_02\web...
D:\work\work11\SingleGAN-master\models\model.py:46: UserWarning: nn.init.xavier_normal is now deprecated in favor of nn.init.xavier_normal_.
init.xavier_normal(m.weight.data, gain=math.sqrt(2))
D:\work\work11\SingleGAN-master\models\model.py:56: UserWarning: nn.init.constant is now deprecated in favor of nn.init.constant_.
init.constant(m.bias.data, 0.0)
Traceback (most recent call last):
  File "D:/work/work11/SingleGAN-master/train.py", line 24, in <module>
    for i, data in enumerate(data_loader):
  File "C:\Users\caocao\Anaconda3\lib\site-packages\torch\utils\data\dataloader.py", line 451, in __iter__
    return _DataLoaderIter(self)
  File "C:\Users\caocao\Anaconda3\lib\site-packages\torch\utils\data\dataloader.py", line 239, in __init__
    w.start()
  File "C:\Users\caocao\Anaconda3\lib\multiprocessing\process.py", line 105, in start
    self._popen = self._Popen(self)
  File "C:\Users\caocao\Anaconda3\lib\multiprocessing\context.py", line 212, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "C:\Users\caocao\Anaconda3\lib\multiprocessing\context.py", line 313, in _Popen
    return Popen(process_obj)
  File "C:\Users\caocao\Anaconda3\lib\multiprocessing\popen_spawn_win32.py", line 66, in __init__
    reduction.dump(process_obj, to_child)
  File "C:\Users\caocao\Anaconda3\lib\multiprocessing\reduction.py", line 59, in dump
    ForkingPickler(file, protocol).dump(obj)
BrokenPipeError: [Errno 32] Broken pipe

Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "C:\Users\caocao\Anaconda3\lib\multiprocessing\spawn.py", line 106, in spawn_main
    exitcode = _main(fd)
  File "C:\Users\caocao\Anaconda3\lib\multiprocessing\spawn.py", line 115, in _main
    prepare(preparation_data)
  File "C:\Users\caocao\Anaconda3\lib\multiprocessing\spawn.py", line 226, in prepare
    _fixup_main_from_path(data['init_main_from_path'])
  File "C:\Users\caocao\Anaconda3\lib\multiprocessing\spawn.py", line 278, in _fixup_main_from_path
    run_name="__mp_main__")
  File "C:\Users\caocao\Anaconda3\lib\runpy.py", line 254, in run_path
    pkg_name=pkg_name, script_name=fname)
  File "C:\Users\caocao\Anaconda3\lib\runpy.py", line 96, in _run_module_code
    mod_name, mod_spec, pkg_name, script_name)
  File "C:\Users\caocao\Anaconda3\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "D:\work\work11\SingleGAN-master\train.py", line 24, in <module>
    for i, data in enumerate(data_loader):
  File "C:\Users\caocao\Anaconda3\lib\site-packages\torch\utils\data\dataloader.py", line 451, in __iter__
    return _DataLoaderIter(self)
  File "C:\Users\caocao\Anaconda3\lib\site-packages\torch\utils\data\dataloader.py", line 239, in __init__
    w.start()
  File "C:\Users\caocao\Anaconda3\lib\multiprocessing\process.py", line 105, in start
    self._popen = self._Popen(self)
  File "C:\Users\caocao\Anaconda3\lib\multiprocessing\context.py", line 212, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "C:\Users\caocao\Anaconda3\lib\multiprocessing\context.py", line 313, in _Popen
    return Popen(process_obj)
  File "C:\Users\caocao\Anaconda3\lib\multiprocessing\popen_spawn_win32.py", line 34, in __init__
    prep_data = spawn.get_preparation_data(process_obj._name)
  File "C:\Users\caocao\Anaconda3\lib\multiprocessing\spawn.py", line 144, in get_preparation_data
    _check_not_importing_main()
  File "C:\Users\caocao\Anaconda3\lib\multiprocessing\spawn.py", line 137, in _check_not_importing_main
    is not going to be frozen to produce an executable.''')
RuntimeError:
An attempt has been made to start a new process before the
current process has finished its bootstrapping phase.

    This probably means that you are not using fork to start your
    child processes and you have forgotten to use the proper idiom
    in the main module:
        if __name__ == '__main__':
            freeze_support()
    The "freeze_support()" line can be omitted if the program
    is not going to be frozen to produce an executable.

Process finished with exit code 1
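The idiom the interpreter is asking for is the standard Windows entry-point guard: anything that starts worker processes, which includes a PyTorch DataLoader with num_workers > 0 (the nThreads: 4 option above presumably maps to this), should only be created and iterated under if __name__ == '__main__':. A minimal sketch of that pattern, using a placeholder dataset and loop rather than SingleGAN's actual code:

import torch
from torch.utils.data import DataLoader, TensorDataset


def build_loader():
    # Placeholder dataset; any DataLoader with num_workers > 0 spawns
    # worker processes on Windows and therefore needs the guard below.
    dataset = TensorDataset(torch.randn(16, 3, 138, 138))
    return DataLoader(dataset, batch_size=1, shuffle=True, num_workers=4)


def main():
    data_loader = build_loader()
    for i, (batch,) in enumerate(data_loader):
        pass  # training step would go here


if __name__ == '__main__':
    # On Windows, multiprocessing re-imports this module in every worker;
    # the guard keeps workers from re-running main() during that import.
    main()

Alternatively, running with num_workers=0 (e.g. --nThreads 0, if that option is wired through to the DataLoader) avoids worker processes entirely, at the cost of loading data in the main process.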

from multiprocessing.pool import Pool
from multiprocessing import freeze_support
import requests

from timer import timer

URL = 'https://httpbin.org/uuid'

def fetch(session, url):
    with session.get(url) as response:
        print(response.json()['uuid'])

@timer(1, 1)
def main():
    with Pool() as pool:
        with requests.Session() as session:
            pool.starmap(fetch, [(session, URL) for _ in range(100)])

if __name__ == '__main__':
    # freeze_support()
    main()

I got the same error while running this.

You may use if __name__ == '__main__': to solve this problem on Windows (reference), or run this code on Linux.
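The reason the guard matters: on Windows, multiprocessing uses the spawn start method, so every worker re-imports the main script from scratch; without the guard, that re-import re-executes the Pool/DataLoader creation at module level and Python aborts with the RuntimeError shown above. On Linux the default fork start method copies the running process instead, which is why the same script can work there. A small standalone sketch of the behaviour, unrelated to SingleGAN itself:

import multiprocessing as mp
import os

# With the spawn start method this line runs in the parent and again in
# every worker, because each worker re-imports the script.
print('importing module in process', os.getpid())


def square(x):
    return x * x


if __name__ == '__main__':
    # Force spawn so the demo behaves the same on Linux as on Windows.
    mp.set_start_method('spawn')
    # Without the guard above, each worker's re-import would reach this
    # Pool creation and raise the same RuntimeError as in this issue.
    with mp.Pool(processes=2) as pool:
        print(pool.map(square, range(4)))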

That's OK. Thank you so much, but why?