Tagebuch (Diary)

Start from GAN
If all data are fed in at once, after enough iterations every output of the generator is the digit 1 (on the MNIST dataset), which is the simplest digit --> "the generator fools the discriminator with garbage"
Training a GAN for each class individually --> 1. the GAN structure suits some classes, but training on others ends in mode collapse; 2. it is not easy to select a model for each class

Then to conditional GAN
Similar structure, but with a one-hot label concatenated to the inputs of G and D.
Advantage: no need to train a model for each class.
Note: learning rate set to 0.001; 0.0001 leads to bad results.
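A minimal numpy sketch of the conditioning step described above, assuming a 100-dim noise vector for G and flattened 28x28 MNIST images for D (both sizes are assumptions, not stated in these notes):

```python
import numpy as np

def one_hot(labels, num_classes=10):
    """Convert integer class labels to one-hot vectors."""
    out = np.zeros((len(labels), num_classes))
    out[np.arange(len(labels)), labels] = 1.0
    return out

# Hypothetical batch: 4 noise vectors and 4 flattened MNIST images.
noise = np.random.randn(4, 100)
images = np.random.rand(4, 784)
labels = np.array([0, 1, 2, 3])

y = one_hot(labels)
g_input = np.concatenate([noise, y], axis=1)   # (4, 110): noise + label
d_input = np.concatenate([images, y], axis=1)  # (4, 794): image + label
```

In a real model the same concatenation would happen inside the network's input layer; the shapes are the only point here.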

Then ACGAN
Current tests show that ACGAN does not work well with two dense layers; the reason might be that ACGAN only works with a convolutional D and G.
todo: pretrain D

Then Wasserstein GAN


  1. January
    refine the proposal

10-12. January

  • implement a DC classifier in preparation for implementing the discriminator
  • read Improved GAN; focus on this paper in the following days
  1. January
  • the DC classifier has no bugs, but performs awfully
  • install Theano and Lasagne to run the improvedGAN code
  1. - 19. January
  • finally installed Theano and its GPU backend correctly and fixed many deprecation issues
  1. January
  • try to translate it to Keras; find a way to implement the loss function
  1. January
  • translation to Keras is too complicated; first try PaviaU with the original Theano code
  • the 1D improved GAN performs too badly when trained on PaviaU (the training data may be the reason; check the training and testing data and resave them)
  1. January
  • prepare questions for tomorrow's meeting:
  • the loss function in the code does not match the loss in the paper, and the former has a very strange form
  • l_lab and train_err are the same thing
  • there is no implementation of the K+1 class
  1. February
  • as to the 3D convolution, an idea: set stride=(1,1,2), which only downsamples the spectral dimension
  • try a semi-supervised GAN: the discriminator classifies labeled samples into classes 1..k and generated samples as class k+1; for unlabeled training data, set the soft label to [0.1, 0.1, 0.1, ..., 0]; test on the MNIST dataset
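The stride idea above can be checked with the standard convolution output-size formula; the 9x9 spatial patch and 103 spectral bands (PaviaU) are assumed example values:

```python
def conv_out_dim(size, kernel, stride, padding=0):
    """Standard convolution output-size formula (floor division)."""
    return (size + 2 * padding - kernel) // stride + 1

# Hypothetical hyperspectral input: 9x9 spatial, 103 spectral bands.
spatial, bands = 9, 103
kernel = (3, 3, 3)
stride = (1, 1, 2)  # stride 2 only along the spectral axis

out = tuple(conv_out_dim(s, k, st)
            for s, k, st in zip((spatial, spatial, bands), kernel, stride))
# spatial dims shrink only by the kernel; the spectral dim is roughly halved
```

So a single such layer keeps the spatial resolution almost intact while compressing the spectral dimension, which is the stated intent of the stride choice.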
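A sketch of the three target vectors described in the semi-supervised bullet, assuming k=10 MNIST classes with index 10 as the generated (k+1) class:

```python
import numpy as np

K = 10  # real classes; index K is the "fake" (k+1) class

def labeled_target(y):
    """One-hot target over K+1 classes for a labeled real sample."""
    t = np.zeros(K + 1)
    t[y] = 1.0
    return t

def fake_target():
    """Generated samples are assigned entirely to the k+1 class."""
    t = np.zeros(K + 1)
    t[K] = 1.0
    return t

def unlabeled_target():
    """Unlabeled real samples: mass spread uniformly over the K real
    classes ([0.1, 0.1, ..., 0.1, 0] for K=10), zero on the fake class."""
    t = np.full(K + 1, 1.0 / K)
    t[K] = 0.0
    return t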
  1. Feb. - 9. Feb.
  • 1D tryout seems good; needs more tests
  1. March
    ready to test:
  • (replace Conv3D with Conv2D)
  • different training data sizes (sample counts)
  • different patch sizes
  • different channel numbers
  • (different batch sizes)
  • (different depthwise conv channels)
  1. March
    found a case: randomly choosing 200 samples from the whole image as the training set gives much better results than randomly choosing 200 samples from the predefined training set

  2. April

  • email the cluster team
  • try cross-validation
  • ask Amir how to determine the final result
  • read the "discr_loss" blog and try their code
  • read the GAN paper
  1. April
  • Adam vs. SGD
    the validation curve with Adam oscillates up and down --> not suitable for a standard early-stopping algorithm
    attempted fix: use a smaller learning rate

  • alternative progress measure (early stopping)
    instead of the ratio of the average training loss to the minimum training loss within a training strip, use the ratio of the current average training loss to the past average training loss
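The two progress measures can be sketched as follows; the strip length of 5 is an assumed hyperparameter, and the names are mine:

```python
def strip_progress(train_losses, strip=5):
    """Strip-based progress: average training loss over the last strip
    divided by the minimum in that strip, minus 1 (near 0 when flat)."""
    recent = train_losses[-strip:]
    return sum(recent) / (strip * min(recent)) - 1.0

def avg_progress(train_losses, strip=5):
    """Alternative from the notes: previous strip average divided by the
    current strip average, minus 1 (positive while loss still decreases)."""
    cur = train_losses[-strip:]
    prev = train_losses[-2 * strip:-strip]
    return (sum(prev) / len(prev)) / (sum(cur) / len(cur)) - 1.0
```

Training would stop once the chosen measure falls below some threshold; the threshold itself is not specified in these notes.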

  • learning rate decay strategy

  • separate optimizers for G and D

  • use the cross-entropy loss of only the first 9 labels to decide when to early-stop
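One reading of this bullet, as a hedged numpy sketch: the validation cross-entropy is computed only over samples whose label falls in the first 9 classes (the restriction-by-sample interpretation is mine; the function name is hypothetical):

```python
import numpy as np

def masked_ce(probs, labels, keep=9):
    """Cross-entropy over only those samples whose label is among the
    first `keep` classes; other samples are ignored for early stopping."""
    mask = labels < keep
    picked = probs[mask, labels[mask]]  # predicted prob of the true class
    return float(-np.mean(np.log(picked + 1e-12)))
```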

  • double-check the DataLoader in demoGAN (Zhu et al., PyTorch)

  1. April
  • test feature matching, starting from a one-layer model (ssgan_improved_pytorch)
  • try to implement a custom loss function, as in Keras
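A minimal numpy sketch of the feature-matching objective from Improved GAN, over hypothetical intermediate discriminator features (in the real model these would come from a hidden layer of D):

```python
import numpy as np

def feature_matching_loss(f_real, f_fake):
    """Squared L2 distance between the batch-mean intermediate features
    of real and generated samples; G is trained to minimize this."""
    return float(np.sum((f_real.mean(axis=0) - f_fake.mean(axis=0)) ** 2))
```

Matching feature statistics instead of fooling D directly is what stabilizes G here; the loss is zero exactly when the two batch means coincide.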