Month: October 2020
Thresholds that start with a colon (:) don’t work
Thresholds that start with a colon (:) don’t work. I was reading this: #205 (comment). Just a number is treated as -c :15, which means anything…
Can’t load config for [community model]
Although I can use a fine-tuned GPT2 model from code, the model page complains about the config file (which is already uploaded) at https://huggingface.co/akhooli/gpt2-small-arabic-poetry (for…
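For context, both local loading and the model page widget resolve the repo’s config.json. A minimal sketch, assuming the standard transformers Auto* API, for checking that the uploaded config loads cleanly:

```python
# Hedged sketch: verify that the community model's config.json resolves.
from transformers import AutoConfig, AutoTokenizer, AutoModelForCausalLM

model_id = "akhooli/gpt2-small-arabic-poetry"  # repo mentioned in the report

config = AutoConfig.from_pretrained(model_id)       # fetches config.json from the hub
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)
print(config.model_type)
```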
benchmarking API: `no_` arguments, double negation, defaults
Another discussion moved to an issue. Here is the gist of it: this help entry: https://github.com/huggingface/transformers/blob/master/src/transformers/benchmark/benchmark_args_utils.py#L74 reads: “help”: “Don’t use multiprocessing for memory and speed…
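A hedged illustration of the double-negation pattern being complained about (names here are illustrative, not the exact transformers dataclass): a `no_*` flag that defaults to False forces callers to reason in double negatives.

```python
# Illustrative sketch of the "no_" argument / double-negation problem.
from dataclasses import dataclass, field

@dataclass
class BenchmarkArgs:
    no_multi_process: bool = field(
        default=False,
        metadata={"help": "Don't use multiprocessing for memory and speed profiling."},
    )

args = BenchmarkArgs()
if not args.no_multi_process:   # "not no_..." -- the double negation in calling code
    print("spawning a separate process per measurement")
```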
Unable to make inferences from the hosted API for a pretrained model that I uploaded.
I have successfully managed to upload a model (https://huggingface.co/shrugging-grace/tweetclassifier) via the transformers CLI. I am also able to generate inferences from a local Jupyter notebook…
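A hedged sketch of querying the hosted inference API for that model; the endpoint pattern and payload follow the public Inference API docs, and YOUR_HF_TOKEN is a placeholder.

```python
# Hedged sketch: call the hosted inference API for an uploaded model.
import requests

API_URL = "https://api-inference.huggingface.co/models/shrugging-grace/tweetclassifier"
headers = {"Authorization": "Bearer YOUR_HF_TOKEN"}  # placeholder token

response = requests.post(API_URL, headers=headers,
                         json={"inputs": "Example tweet to classify"})
print(response.status_code, response.json())
```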
[testing] USE_CUDA default and intuitive skip decorators
This library’s primary use is for GPU work, and currently many tests won’t run even if a GPU is available, since the current setup wants env…
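A minimal sketch of an “intuitive skip decorator” in the spirit of the proposal; the names are illustrative, not the library’s own testing helpers.

```python
# Hedged sketch: skip a test automatically when no CUDA device is available.
import unittest

try:
    import torch
    _gpu_available = torch.cuda.is_available()
except ImportError:
    _gpu_available = False

def require_gpu(test_case):
    """Skip the decorated test when no CUDA GPU is present."""
    return unittest.skipUnless(_gpu_available, "test requires a CUDA GPU")(test_case)

class ExampleTest(unittest.TestCase):
    @require_gpu
    def test_runs_on_gpu(self):
        self.assertTrue(_gpu_available)
```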
ZeroDivisionError with Reformer
Environment info transformers version: 3.0.0 Platform: Linux-5.4.0-42-generic-x86_64-with-Ubuntu-18.04-bionic Python version: 3.6.9 PyTorch version (GPU?): 1.6.0 (True) Tensorflow version (GPU?): not installed (NA) Using GPU in script?:…
I have used t5_base for abstractive summarization but it is not giving good results. Could you please give me a solution for this?
🖥 Benchmarking transformers Benchmark Which part of transformers did you benchmark? Set-up What did you run your benchmarks on? Please include details, such as: CPU,…
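For the t5_base summarization question above, a hedged sketch of the usual setup: the “summarize: ” task prefix plus beam-search generation settings that commonly improve output quality. The parameter values are illustrative defaults, not a guaranteed fix.

```python
# Hedged sketch: t5-base abstractive summarization with a task prefix and beam search.
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")

text = "summarize: " + "Long article text goes here ..."
inputs = tokenizer(text, return_tensors="pt", max_length=512, truncation=True)

summary_ids = model.generate(
    inputs["input_ids"],
    num_beams=4,
    length_penalty=2.0,
    max_length=150,
    min_length=40,
    no_repeat_ngram_size=3,
    early_stopping=True,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```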
Longformer Memory Consumption query
Hello, Apologies if I am misunderstanding it, but if I use roberta with a max sequence length of 256 and I can run, for example,…
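A hedged back-of-envelope comparison that frames the memory question: full self-attention builds a quadratic score matrix, while Longformer’s sliding window is linear in sequence length, so a long Longformer input can still cost more than a short RoBERTa one. Numbers are illustrative only.

```python
# Hedged sketch: attention-score entries per head, ignoring activations,
# global tokens, and implementation overheads.
def full_attention_scores(seq_len):
    return seq_len * seq_len            # every token attends to every token

def windowed_attention_scores(seq_len, window):
    return seq_len * window             # every token attends to a local window

for seq_len in (256, 4096):
    print(seq_len, full_attention_scores(seq_len), windowed_attention_scores(seq_len, 512))
```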
Uploaded model is not indexed
❓ Questions & Help Hi guys, I uploaded a model several hours ago (t5-base-finetuned-boolq) and it is not indexed in the model hub search engine…
Longformer run error
❓ Questions & Help Details When I train a classification model with Longformer:

```python
def forward(self, input):
    embding = input['enc']
    att_mask = input['mask']
    att_mask[:, [100, 300, 500, 800, 1200]] = 2
    labels = input['targets']
    print('jeff:', embding.device, att_mask.device, self.l1.device,
          embding.shape, att_mask.shape, self.maxlen)
    logit = self.l1(inputs_embeds=embding, attention_mask=att_mask)  # [:2]
    return [logit, labels]
```

Meet…
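For context, the forward pass above relies on the convention used by Longformer in the transformers releases of that period: attention_mask values of 2 mark global-attention positions. A hedged, self-contained sketch of that convention, with illustrative model name, sequence length, and token choice:

```python
# Hedged sketch (assuming the 2020-era Longformer convention):
# attention_mask values: 0 = masked, 1 = local attention, 2 = global attention.
import torch
from transformers import LongformerModel, LongformerTokenizer

tokenizer = LongformerTokenizer.from_pretrained("allenai/longformer-base-4096")
model = LongformerModel.from_pretrained("allenai/longformer-base-4096")

inputs = tokenizer("a short classification example", return_tensors="pt",
                   padding="max_length", max_length=1024)
attention_mask = inputs["attention_mask"]
attention_mask[:, 0] = 2                      # give the first token global attention

outputs = model(input_ids=inputs["input_ids"], attention_mask=attention_mask)
print(outputs[0].shape)                       # (batch, seq_len, hidden_size)
```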