
TypeError: cannot unpack non-iterable NoneType object while importing fairseq #48

Open
subnkve479 opened this issue Sep 19, 2022 · 5 comments

Comments

subnkve479 commented Sep 19, 2022

Hi Team

I am trying to reproduce indicTrans_python_interface.ipynb, but I am not able to import the fairseq library. Below is the error I am facing.

[Screenshot of the import error, 2022-09-19]

I can see that the same issue is still open in the fairseq GitHub repo. I tried installing the torch and torchvision packages as mentioned in the link below, but I am still facing the same issue.

facebookresearch/fairseq#4214

This issue is also blocking me from training my own model using IndicTrans_training.ipynb.

I can see that the import was successful in the indicTrans_python_interface.ipynb shared in this repo, with a few warnings.

Below is a link to my notebook:

https://colab.research.google.com/drive/1e0G_jDe8_0hd-xtj1e4zhwqJQNtE86EC?usp=sharing

Can you please help me here?

Regards
Subbu

@subnkve479

@anoopkunchukuttan, @GokulNC, @sumanthd17, can you please help me here?

Regards
Subbu

@HarryHe11

> @anoopkunchukuttan, @GokulNC, @sumanthd17, can you please help me here?
>
> Regards Subbu

Yeah, I had the same issue. I can't load any translation model. Please check the following error message:

Using cache found in /root/.cache/torch/hub/pytorch_fairseq_main
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-24-5863cdfe7acc> in <module>
      2 
      3 # Load an En-Fr Transformer model trained on WMT'14 data :
----> 4 en2fr = torch.hub.load('pytorch/fairseq', 'transformer.wmt14.en-fr', tokenizer='moses', bpe='subword_nmt')
      5 
      6 # Use the GPU (optional):

7 frames
/usr/local/lib/python3.7/dist-packages/torch/hub.py in load(repo_or_dir, model, source, trust_repo, force_reload, verbose, skip_validation, *args, **kwargs)

/usr/local/lib/python3.7/dist-packages/torch/hub.py in _load_local(hubconf_dir, model, *args, **kwargs)

/usr/local/lib/python3.7/dist-packages/torch/hub.py in _import_module(name, path)
     87     return 'https://github.com/{}/{}/archive/{}.zip'.format(repo_owner, repo_name, branch)
     88 
---> 89 
     90 def _load_attr_from_module(module, func_name):
     91     # Check if callable is defined in the module

/usr/lib/python3.7/importlib/_bootstrap_external.py in exec_module(self, module)

/usr/lib/python3.7/importlib/_bootstrap.py in _call_with_frames_removed(f, *args, **kwds)

~/.cache/torch/hub/pytorch_fairseq_main/hubconf.py in <module>
     37 
     38 # only do fairseq imports after checking for dependencies
---> 39 from fairseq.hub_utils import (  # noqa; noqa
     40     BPEHubInterface as bpe,
     41     TokenizerHubInterface as tokenizer,

~/.cache/torch/hub/pytorch_fairseq_main/fairseq/__init__.py in <module>
     31 hydra_init()
     32 
---> 33 import fairseq.criterions  # noqa
     34 import fairseq.distributed  # noqa
     35 import fairseq.models  # noqa

~/.cache/torch/hub/pytorch_fairseq_main/fairseq/criterions/__init__.py in <module>
     22     CRITERION_DATACLASS_REGISTRY,
     23 ) = registry.setup_registry(
---> 24     "--criterion", base_class=FairseqCriterion, default="cross_entropy"
     25 )
     26 

TypeError: cannot unpack non-iterable NoneType object
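
For context, the unpack fails because registry.setup_registry() returned None instead of the expected tuple; in fairseq's registry.py that happens when the "--criterion" registry has already been set up, i.e. fairseq was already (partially) imported in the same interpreter. A small diagnostic sketch (an assumption about the cause, not an official fix) to check for a stale import before retrying:

```python
import sys

# Diagnostic sketch (assumption: setup_registry() returns None because a fairseq
# registry was already populated by an earlier, possibly partial, import).
# If any fairseq modules are already loaded, restart the runtime/interpreter
# before calling torch.hub.load again instead of re-importing in place.
stale = sorted(m for m in sys.modules if m == "fairseq" or m.startswith("fairseq."))
print("fairseq modules already loaded:", stale or "none")

if not stale:
    import fairseq
    print("fairseq imported from:", fairseq.__file__)
```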

@subnkve479

Hi Team
Can you please help me with this issue?

Regards
Subbu

@HarryHe11

> Hi Team, can you please help me with this issue?
>
> Regards Subbu

Hey buddy, I guess this annoying issue comes from the fairseq repo rather than from the contributors of this repo. I have been running into the same issue recently, and no one has given me any response so far.

@subnkve479

subnkve479 commented Sep 26, 2022

> Hi Team, can you please help me with this issue?
>
> Regards Subbu

> Hey buddy, I guess this annoying issue comes from the fairseq repo rather than from the contributors of this repo. I have been running into the same issue recently, and no one has given me any response so far.

Yes, there are many open issues in the fairseq repo regarding this error. But I can see a successful run in the notebook shared in this repo, so I am asking the maintainers here for the workaround they followed, if any.
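
One workaround worth trying (an untested sketch, not something confirmed by the maintainers here) is to point torch.hub at a tagged fairseq release instead of the main branch that it caches by default:

```python
import torch

# Hedged workaround sketch: load a released fairseq tag rather than the cached
# main-branch checkout. torch.hub.load accepts the "owner/repo:ref" form, and
# force_reload=True bypasses the existing ~/.cache/torch/hub copy.
# v0.12.2 is only an example tag, not a verified fix for this error.
en2fr = torch.hub.load(
    'pytorch/fairseq:v0.12.2',
    'transformer.wmt14.en-fr',
    tokenizer='moses',
    bpe='subword_nmt',
    force_reload=True,
)
print(en2fr.translate('Hello world!'))
```

If that loads, the same pinned-tag idea may carry over to the fairseq checkout used by the IndicTrans notebooks.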
