"Classifying plankton with deep neural networks" notes

  • Cross-entropy loss is not quite the same as optimizing classification accuracy, although the two are correlated (a small worked example follows this list).
  • It's not necessarily true that deep learning approaches require enormous amounts of data to work well: in this competition there are only about 30,000 examples for 121 classes.
    To make this work, some of the tricks are (see the dropout/weight-decay and augmentation sketches after this list):

    • dropout
    • weight decay
    • data augmentation
    • pre-training
    • pseudo-labeling
    • parameter sharing
  • The method is implemented using Theano and related libraries:

    • Python, Numpy, Theano, cuDNN, PyCUDA, Lasagne
    • scikit-image: pre-processing and data augmentation
    • ghalton: quasi-random number generation
  • Hardware:

    • GTX 980, GTX 680, Tesla K40
  • Pre-processing and data augmentation:

    • Normalization: per-pixel zero mean, unit variance (see the normalization sketch after this list)
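
Worked example for the first point, a minimal NumPy sketch (not from the original post): both predictions below classify the example correctly, so they contribute the same accuracy, but their cross-entropy losses differ.

```python
import numpy as np

def cross_entropy(probs, label):
    """Negative log-likelihood of the true class."""
    return -np.log(probs[label])

def accuracy(probs, label):
    """1 if the argmax prediction matches the label, else 0."""
    return int(np.argmax(probs) == label)

label = 0
confident = np.array([0.9, 0.05, 0.05])   # confident and correct
hesitant  = np.array([0.4, 0.30, 0.30])   # barely correct

for name, p in [("confident", confident), ("hesitant", hesitant)]:
    print(name, "accuracy:", accuracy(p, label),
          "cross-entropy:", round(cross_entropy(p, label), 3))
# Both predictions have accuracy 1, but the hesitant one has a much higher loss,
# so minimizing cross entropy is not exactly the same as maximizing accuracy.
```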
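
A minimal Lasagne/Theano sketch of two of the tricks listed above, dropout and weight decay. This is not the competition architecture; the layer sizes and hyperparameters here are illustrative assumptions.

```python
import theano
import theano.tensor as T
import lasagne

# Tiny illustrative network: input -> conv -> pool -> dropout -> softmax output.
l_in = lasagne.layers.InputLayer(shape=(None, 1, 95, 95))
l_conv = lasagne.layers.Conv2DLayer(l_in, num_filters=32, filter_size=(3, 3))
l_pool = lasagne.layers.MaxPool2DLayer(l_conv, pool_size=(2, 2))
l_drop = lasagne.layers.DropoutLayer(l_pool, p=0.5)          # dropout
l_out = lasagne.layers.DenseLayer(l_drop, num_units=121,
                                  nonlinearity=lasagne.nonlinearities.softmax)

X = T.tensor4('X')
y = T.ivector('y')

probs = lasagne.layers.get_output(l_out, X)
loss = lasagne.objectives.categorical_crossentropy(probs, y).mean()

# Weight decay: add an L2 penalty over all network parameters to the loss.
loss = loss + 1e-4 * lasagne.regularization.regularize_network_params(
    l_out, lasagne.regularization.l2)

params = lasagne.layers.get_all_params(l_out, trainable=True)
updates = lasagne.updates.nesterov_momentum(loss, params,
                                            learning_rate=0.01, momentum=0.9)
train_fn = theano.function([X, y], loss, updates=updates)
```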
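
A minimal scikit-image sketch of random affine data augmentation (rotation, zoom, shear, translation, flip) for a single-channel image; the parameter ranges are illustrative assumptions, not the ones used in the post.

```python
import numpy as np
import skimage.transform

def random_augment(img, rng=np.random):
    """Apply a random affine transform to a 2-D image, centred on the image."""
    rows, cols = img.shape
    centre = np.array([cols, rows]) / 2.0 - 0.5

    tform = skimage.transform.AffineTransform(
        scale=(rng.uniform(0.8, 1.2),) * 2,          # zoom
        rotation=np.deg2rad(rng.uniform(0, 360)),    # rotation
        shear=np.deg2rad(rng.uniform(-20, 20)),      # shear
        translation=rng.uniform(-4, 4, size=2),      # shift in pixels
    )
    # Shift to the centre, apply the transform, shift back.
    shift = skimage.transform.AffineTransform(translation=-centre)
    unshift = skimage.transform.AffineTransform(translation=centre)
    warped = skimage.transform.warp(img, (shift + tform + unshift).inverse,
                                    mode='constant', cval=0.0)
    if rng.rand() > 0.5:                             # random horizontal flip
        warped = warped[:, ::-1]
    return warped
```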
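
A minimal NumPy sketch of per-pixel zero-mean, unit-variance normalization, assuming the images are stacked into an array of shape (num_images, height, width) and that the statistics computed on the training set are reused for validation/test data.

```python
import numpy as np

def fit_pixel_stats(train, eps=1e-8):
    """Compute the per-pixel mean and standard deviation over the training images."""
    mean = train.mean(axis=0)
    std = train.std(axis=0) + eps     # avoid division by zero for constant pixels
    return mean, std

def normalize(images, mean, std):
    """Subtract the per-pixel mean and divide by the per-pixel std."""
    return (images - mean) / std

# Usage (hypothetical arrays):
# mean, std = fit_pixel_stats(train_images)
# train_images = normalize(train_images, mean, std)
# valid_images = normalize(valid_images, mean, std)
```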

to be finished
