
Multi-Class Classification Using InceptionV3, VGG16 With 101 Classes: Very Low Accuracy

I am trying to build a food classification model with 101 classes. The dataset has 1,000 images for each class. The accuracy of the model I trained is coming out at less than 6%.

Solution 1:

There are a few things that may improve the classification accuracy:

  1. Use EfficientNet with noisy-student weights. It has fewer parameters to train and tends to give better accuracy thanks to its compound-scaling architecture (a loading sketch is included after this list).

  2. Use test-time augmentation (TTA). In your test data generator, apply a simple horizontal flip, a vertical flip (if flipped images still look realistic for your data) and mild affine transformations. This produces multiple views of each image, and averaging the predictions over them makes the predicted class more robust (see the sketch after this list).

  3. Check out the imgaug library (embossing, sharpening, noise addition, etc.). In addition, random erasing, cutout and mixup strategies have proven useful (an example pipeline follows this list).

  4. Try label smoothing. It discourages over-confident predictions and often helps the classifier generalize, so more probability mass ends up on the correct class at test time (a one-line Keras example is shown after this list).

  5. Try learning-rate warmup followed by exponential decay, something like this (a callback hookup is sketched after the list):

LR_START = 0.00001       # learning rate at epoch 0
LR_MAX = 0.00005         # peak learning rate reached after warmup
LR_MIN = 0.00001         # floor that the decay converges towards
LR_RAMPUP_EPOCHS = 4     # epochs of linear warmup
LR_SUSTAIN_EPOCHS = 6    # epochs held at LR_MAX before decay
LR_EXP_DECAY = .8        # exponential decay factor afterwards


def lrfn(epoch):
    # Linear warmup from LR_START to LR_MAX, hold at LR_MAX for a few
    # epochs, then decay exponentially towards LR_MIN.
    if epoch < LR_RAMPUP_EPOCHS:
        lr = (LR_MAX - LR_START) / LR_RAMPUP_EPOCHS * epoch + LR_START
    elif epoch < LR_RAMPUP_EPOCHS + LR_SUSTAIN_EPOCHS:
        lr = LR_MAX
    else:
        lr = (LR_MAX - LR_MIN) * LR_EXP_DECAY**(epoch - LR_RAMPUP_EPOCHS - LR_SUSTAIN_EPOCHS) + LR_MIN
    return lr
  6. You can also extract features with a pretrained backbone and fit an ensemble classifier on them (XGBoost, AdaBoost, BaggingClassifier), or train with a triplet loss (a feature-extraction sketch closes out the examples below).
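
For tip 1, a minimal loading sketch. It assumes the third-party efficientnet package (qubvel's implementation, pip install efficientnet), which ships the noisy-student weights; the backbone variant, input size and head below are placeholders to adapt to your pipeline.

import tensorflow as tf
import efficientnet.tfkeras as efn  # third-party package, assumption

NUM_CLASSES = 101  # Food-101

# Backbone with noisy-student weights; freeze it first, fine-tune later.
base = efn.EfficientNetB3(weights='noisy-student',
                          include_top=False,
                          input_shape=(300, 300, 3))
base.trainable = False

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(NUM_CLASSES, activation='softmax'),
])
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])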
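
For tip 2, a minimal test-time augmentation sketch that averages the softmax outputs over the original images and a horizontally flipped copy. model and x_test are assumed to already exist (your trained model and a preprocessed NHWC image batch).

import numpy as np

def predict_with_tta(model, x_test):
    # Average predictions over the original batch and its horizontal flip.
    preds = model.predict(x_test)
    preds_flipped = model.predict(x_test[:, :, ::-1, :])  # flip width axis
    return (preds + preds_flipped) / 2.0

# probs = predict_with_tta(model, x_test)
# y_pred = probs.argmax(axis=1)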
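
For tip 3, a rough imgaug pipeline (pip install imgaug) showing the kind of augmenters mentioned above: emboss, sharpen, additive noise and a cutout-style coarse dropout. The probabilities and parameter ranges are only starting points.

import imgaug.augmenters as iaa

augmenter = iaa.Sequential([
    iaa.Fliplr(0.5),                                                   # horizontal flip
    iaa.Sometimes(0.3, iaa.Emboss(alpha=(0.0, 1.0), strength=(0.5, 1.5))),
    iaa.Sometimes(0.3, iaa.Sharpen(alpha=(0.0, 1.0), lightness=(0.8, 1.2))),
    iaa.Sometimes(0.3, iaa.AdditiveGaussianNoise(scale=(0, 0.05 * 255))),
    iaa.Sometimes(0.3, iaa.CoarseDropout(0.05, size_percent=0.1)),     # cutout-style erasing
], random_order=True)

# images: uint8 array of shape (N, H, W, 3)
# augmented = augmenter(images=images)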
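
For tip 4, label smoothing is a one-liner with the Keras cross-entropy loss (assuming one-hot encoded targets); 0.1 is a common value to start from.

import tensorflow as tf

loss_fn = tf.keras.losses.CategoricalCrossentropy(label_smoothing=0.1)
model.compile(optimizer='adam', loss=loss_fn, metrics=['accuracy'])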
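
For tip 5, the lrfn schedule above can be plugged into training through a LearningRateScheduler callback, for example:

import tensorflow as tf

lr_callback = tf.keras.callbacks.LearningRateScheduler(lrfn, verbose=True)
# model.fit(train_gen, epochs=30, callbacks=[lr_callback])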
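
For tip 6, a sketch of feature extraction with a frozen backbone followed by an XGBoost classifier (pip install xgboost). The array names (train_images, y_train, val_images, y_val) are placeholders for your own preprocessed images and integer class labels.

import tensorflow as tf
from xgboost import XGBClassifier

# Frozen backbone with global average pooling -> one feature vector per image.
backbone = tf.keras.applications.InceptionV3(weights='imagenet',
                                             include_top=False,
                                             pooling='avg')
X_train = backbone.predict(train_images)
X_val = backbone.predict(val_images)

clf = XGBClassifier(n_estimators=300, max_depth=6, learning_rate=0.1)
clf.fit(X_train, y_train)
print("validation accuracy:", clf.score(X_val, y_val))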
