2 votes
2 answers
49 views

I am working with the meta-feature extractor package pymfe for complexity analysis. On a small dataset this is not a problem, for example: pip install -U pymfe; from sklearn.datasets import ...
arilwan
  • 4,111
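The complexity measures pymfe extracts include, among others, Fisher's discriminant ratio (F1); a minimal numpy sketch of one common two-class formulation (the data and class separation below are illustrative, not from the post — higher values mean more separable, i.e. easier, data):

```python
import numpy as np

def fisher_f1(X, y):
    """Maximum Fisher's discriminant ratio over features (two-class case).

    Per feature f: (mean_0 - mean_1)^2 / (var_0 + var_1); return the max.
    """
    classes = np.unique(y)
    assert len(classes) == 2, "this sketch handles the two-class case only"
    X0, X1 = X[y == classes[0]], X[y == classes[1]]
    num = (X0.mean(axis=0) - X1.mean(axis=0)) ** 2
    den = X0.var(axis=0) + X1.var(axis=0)
    ratios = num / np.where(den == 0, np.finfo(float).eps, den)
    return float(ratios.max())

rng = np.random.default_rng(0)
# Well-separated classes give a large F1; overlapping classes a small one.
X_easy = np.vstack([rng.normal(0, 1, (50, 3)), rng.normal(6, 1, (50, 3))])
X_hard = np.vstack([rng.normal(0, 1, (50, 3)), rng.normal(0.5, 1, (50, 3))])
y = np.array([0] * 50 + [1] * 50)
```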
0 votes
1 answer
199 views

I'm trying to implement this paper: https://arxiv.org/pdf/2212.07677 (their code is here: https://github.com/google-research/self-organising-systems/tree/master/transformers_learn_icl_by_gd). I'm ...
William Convertino
1 vote
0 answers
56 views

I want to train a model for subjective question scoring using ALBERT and a Siamese network, which consists of a bidirectional LSTM and a fully connected layer. During training, I've noticed that the ...
Ken
  • 17
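A Siamese scorer like the one described typically encodes the reference answer and the student answer with the shared tower and then scores their similarity; a minimal numpy sketch of that comparison step (plain vectors stand in for the BiLSTM outputs, and the score scaling is an illustrative choice, not from the post):

```python
import numpy as np

def cosine_similarity(u, v):
    """Cosine similarity between two embedding vectors, in [-1, 1]."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def similarity_score(ref_emb, ans_emb, max_score=10.0):
    """Map cosine similarity onto a [0, max_score] mark (illustrative)."""
    sim = cosine_similarity(ref_emb, ans_emb)
    return max_score * (sim + 1.0) / 2.0
```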
0 votes
0 answers
37 views

Following is the code I'm trying to implement for meta-learning using the MAML algorithm on a specific dataset. The inner loop works well; I don't know why the gradients in the outer loop are None. def maml(...
Pranav Belhekar
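Outer-loop gradients coming back as None in PyTorch MAML code usually mean the adapted weights were detached from the graph (for instance, inner updates done under torch.no_grad(), or torch.autograd.grad called without create_graph=True). The structure the outer loop needs can be sketched in plain numpy as first-order MAML on 1-D linear regression (task distribution and step sizes here are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

def task_loss_grad(w, a):
    """Task: fit y = a*x with model y_hat = w*x; returns (loss, dloss/dw)."""
    x = rng.uniform(-1, 1, 20)
    err = w * x - a * x
    return float((err ** 2).mean()), float(2 * (err * x).mean())

def fomaml(meta_steps=200, inner_lr=0.1, outer_lr=0.05, inner_steps=3):
    w = 0.0
    for _ in range(meta_steps):
        a = rng.uniform(-2, 2)           # sample a task
        w_fast = w
        for _ in range(inner_steps):     # inner-loop adaptation on the task
            _, g = task_loss_grad(w_fast, a)
            w_fast -= inner_lr * g
        # first-order MAML: outer gradient = task gradient at adapted weights
        _, g_outer = task_loss_grad(w_fast, a)
        w -= outer_lr * g_outer
    return w

w_meta = fomaml()
```

The outer update only works because `g_outer` is taken after adaptation; in PyTorch the analogous requirement is that the adapted parameters stay connected to the meta-parameters in the autograd graph.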
2 votes
1 answer
732 views

I have complex data in which there are only two features and around 18,211 labels (a multi-label regression problem). The two features are categorical: feature1 has 6 categories (A, B, C, D, E, F) and ...
Pranav Belhekar
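With only two categorical inputs, one-hot encoding them gives a regressor a dense input to predict the many targets from; a minimal sketch (the A–F set is from the post, but the second feature's categories are truncated there, so the X/Y/Z set below is a placeholder):

```python
import numpy as np

def one_hot_pair(f1, f2, cats1, cats2):
    """Concatenated one-hot encoding of two categorical features."""
    v = np.zeros(len(cats1) + len(cats2))
    v[cats1.index(f1)] = 1.0
    v[len(cats1) + cats2.index(f2)] = 1.0
    return v

cats1 = ["A", "B", "C", "D", "E", "F"]  # from the post
cats2 = ["X", "Y", "Z"]                 # placeholder: truncated in the post
x = one_hot_pair("C", "Y", cats1, cats2)
```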
0 votes
1 answer
207 views

I have used CLIP embeddings of image and text as the input, and the output is a label ranging from 0 to 5 (a 6-way label). I tried to make an implementation of this multimodal 6-way classification using ...
varun80042
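A 6-way head over concatenated CLIP image and text embeddings can be sketched as a single linear layer plus softmax; the weights below are random stand-ins rather than trained values, and the 512-d embedding size is an assumption (it matches CLIP ViT-B/32, but the post does not say which model was used):

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max()                     # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def classify(img_emb, txt_emb, W, b):
    """Linear head over the concatenated multimodal embedding -> 6 probs."""
    x = np.concatenate([img_emb, txt_emb])
    return softmax(W @ x + b)

D = 512                                 # assumed CLIP embedding size
W = rng.normal(0, 0.01, (6, 2 * D))     # untrained stand-in weights
b = np.zeros(6)
probs = classify(rng.normal(size=D), rng.normal(size=D), W, b)
```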
0 votes
1 answer
172 views

I am training VGG11 on a custom image dataset for 3-way 5-shot image classification using MAML from learn2learn. I am encapsulating the whole VGG11 model with MAML, i.e., not just the classification ...
The Exile
  • 724
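A 3-way 5-shot episode like the one being trained on samples 3 classes and 5 support examples per class; a minimal stdlib sampler sketch showing the shape of an episode (learn2learn provides its own task/episode machinery, so this is only the concept, and the toy class names are made up):

```python
import random

def sample_episode(data_by_class, n_way=3, k_shot=5, seed=None):
    """Return {class: [k_shot examples]} for n_way randomly chosen classes."""
    rng = random.Random(seed)
    ways = rng.sample(sorted(data_by_class), n_way)
    return {c: rng.sample(data_by_class[c], k_shot) for c in ways}

# Toy dataset: 8 classes with 20 examples each.
data = {c: [f"{c}_{i}" for i in range(20)] for c in "abcdefgh"}
episode = sample_episode(data, seed=0)
```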
0 votes
0 answers
130 views

I am solving a meta-learning problem using the Reptile algorithm as used here. I have two datasets. One contains the following classes: iris, pupil, and sclera, along with their annotations. Another ...
Na462
  • 11
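Reptile's meta-update is just an interpolation toward the task-adapted weights, theta <- theta + eps * (phi - theta); a numpy sketch of one meta-step (the SGD inner loop, step sizes, and quadratic toy task are illustrative choices, not from the post):

```python
import numpy as np

def reptile_step(theta, task_grad, meta_lr=0.1, inner_steps=5, inner_lr=0.01):
    """One Reptile meta-update: adapt on a task, then move theta toward phi."""
    phi = theta.copy()
    for _ in range(inner_steps):
        phi -= inner_lr * task_grad(phi)    # plain SGD inner loop
    return theta + meta_lr * (phi - theta)  # interpolate toward adapted weights

target = np.array([1.0, -2.0])              # the toy task's optimal weights
theta0 = np.zeros(2)
theta1 = reptile_step(theta0, lambda w: 2 * (w - target))
```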
0 votes
0 answers
126 views

I am working on an implementation of MAML (see https://arxiv.org/pdf/1703.03400.pdf) in JAX. When training on a distribution of simple linear regression tasks it seems to perform fine (takes a while ...
Sefton de Pledge
0 votes
1 answer
361 views

I want to train a model that performs few-shot image classification using CIFAR-10, so I have to train the model with a small number of classes and use the rest of the classes for testing. I'm ...
Giulia
  • 11
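Few-shot evaluation on CIFAR-10 needs disjoint meta-train and meta-test class sets; a minimal split sketch (the 7/3 split ratio is an arbitrary illustrative choice, not from the post — only the CIFAR-10 class names are real):

```python
import random

CIFAR10_CLASSES = ["airplane", "automobile", "bird", "cat", "deer",
                   "dog", "frog", "horse", "ship", "truck"]

def split_classes(classes, n_train=7, seed=0):
    """Disjoint meta-train / meta-test class split for few-shot learning."""
    rng = random.Random(seed)
    shuffled = classes[:]
    rng.shuffle(shuffled)
    return shuffled[:n_train], shuffled[n_train:]

train_cls, test_cls = split_classes(CIFAR10_CLASSES)
```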
0 votes
1 answer
859 views

I was using the mini-imagenet data set and noticed this line of code: elif data_augmentation == 'lee2019: normalize = Normalize( mean=[120.39586422 / 255.0, 115.59361427 / 255....
Charlie Parker
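The 'lee2019' branch quoted above builds a channel-wise Normalize from fixed dataset statistics; the operation itself is just (x - mean) / std per channel. A numpy sketch with placeholder statistics (the post's constant list is truncated, so the numbers below are illustrative, not lee2019's):

```python
import numpy as np

def normalize(img, mean, std):
    """Channel-wise (x - mean) / std for an HWC float image in [0, 1]."""
    return (img - np.asarray(mean)) / np.asarray(std)

mean = [0.47, 0.45, 0.41]   # placeholder per-channel means (not lee2019's)
std = [0.28, 0.27, 0.29]    # placeholder per-channel stds
rng = np.random.default_rng(0)
img = rng.uniform(size=(8, 8, 3))
out = normalize(img, mean, std)
```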
1 vote
1 answer
173 views

I have many pre-trained models with different numbers of layers (the models are not Sequential). The training data had shape (1, 1, 103) for these models, and the output was a class label between 0 and 9. I ...
Karan Owalekar
0 votes
0 answers
54 views

I've been reading this research paper- https://arxiv.org/abs/1908.00413, and trying to implement the code from GitHub- https://github.com/hoyeoplee/MeLU, however, I run into a runtime error while ...
Siddharth Mehrotra
0 votes
0 answers
280 views

I wanted to use the means and stds from training rather than batch stats, since it seems that if I use batch statistics my model diverges (as outlined here: When should one call .eval() and .train() when doing ...
Charlie Parker
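The difference between the two modes being debated here: train mode normalizes each batch with that batch's own statistics while updating running averages, whereas eval mode normalizes with the stored running averages. A minimal numpy sketch of both paths (the 0.1 momentum matches PyTorch's BatchNorm default; the toy data is illustrative):

```python
import numpy as np

class BatchNorm1dSketch:
    """Minimal 1-D batch norm contrasting batch stats vs. running stats."""

    def __init__(self, dim, momentum=0.1, eps=1e-5):
        self.running_mean = np.zeros(dim)
        self.running_var = np.ones(dim)
        self.momentum, self.eps = momentum, eps

    def __call__(self, x, training):
        if training:        # use this batch's stats, update running averages
            mean, var = x.mean(0), x.var(0)
            m = self.momentum
            self.running_mean = (1 - m) * self.running_mean + m * mean
            self.running_var = (1 - m) * self.running_var + m * var
        else:               # use the stored running averages
            mean, var = self.running_mean, self.running_var
        return (x - mean) / np.sqrt(var + self.eps)

bn = BatchNorm1dSketch(4)
rng = np.random.default_rng(0)
x = rng.normal(3.0, 2.0, (32, 4))     # data far from the N(0, 1) init stats
y_train = bn(x, training=True)        # normalized with this batch's stats
y_eval = bn(x, training=False)        # normalized with running averages
```

After a single training batch the running averages are still close to their initial values, so the eval-mode output of the same batch is far from zero-mean; that gap is exactly what changes when one flips `.train()`/`.eval()` at meta-test time.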
1 vote
1 answer
407 views

I was going through the Omniglot MAML example and saw that they have net.train() at the top of their testing code. This seems like a mistake, since that means the stats from each task at meta-testing ...
Charlie Parker
