nohup.out
WARNING:tensorflow:From /home/gxdai/Focal-Contrastive-Loss/model.py:202: get_regularization_losses (from tensorflow.contrib.losses.python.losses.loss_ops) is deprecated and will be removed after 2016-12-30.
Instructions for updating:
Use tf.losses.get_regularization_losses instead.
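The deprecated call is at model.py line 202 (per the warning). The replacement is the one the warning itself names; a minimal sketch, assuming the surrounding code only needs the collected regularization tensors:

    import tensorflow as tf

    # Deprecated tf.contrib call (removal announced for after 2016-12-30):
    #   reg_losses = tf.contrib.losses.get_regularization_losses()
    # Replacement recommended by the warning:
    reg_losses = tf.losses.get_regularization_losses()  # list of scalar tensors
    reg_loss = tf.add_n(reg_losses) if reg_losses else tf.constant(0.0)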
2018-10-22 10:22:13.520905: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2018-10-22 10:22:14.428501: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1411] Found device 0 with properties:
name: TITAN Xp major: 6 minor: 1 memoryClockRate(GHz): 1.582
pciBusID: 0000:05:00.0
totalMemory: 11.91GiB freeMemory: 54.06MiB
2018-10-22 10:22:14.428547: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1490] Adding visible gpu devices: 0
ALL the args information
Namespace(batch_size=32, ckpt_dir='./models/siamese', class_num=5, display_step=20, dropout_keep_prob=0.5, embedding_size=128, eval_step=10, evaluation=0, focal_decay_factor=2.0, height=512, image_txt='/data1/Guoxian_Dai/CUB_200_2011/images.txt', label_txt='/data1/Guoxian_Dai/CUB_200_2011/image_class_labels.txt', learning_rate=0.001, learning_rate2=0.0001, learning_rate_decay_type='exponential', loss_type='contrastive_loss', margin=1.0, mode='train', momentum=0.9, num_epochs1=20, num_epochs2=10, num_epochs_per_decay=5, num_workers=4, optimizer='rmsprop', pair_type='vector', pretrained_model_path='./weights/inception_v3.ckpt', restore_ckpt=0, root_dir='/data1/Guoxian_Dai/CUB_200_2011/images', targetNum=1000, train_test_split_txt='/data1/Guoxian_Dai/CUB_200_2011/train_test_split.txt', weightFile='./models/my-model', weight_decay=0.0005, width=512, with_regularizer=False)
Configuration information
********************
optimizer = rmsprop
learning_rate = 0.00100
learning_rate_decay_type = exponential
loss_type = contrastive_loss
margin = 1.0
with_regularizer = 0
focal_decay_factor = 2.0
training image number is 5864
testing image number is 5924
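The configuration above pairs RMSProp (momentum = 0.9) with exponential learning-rate decay every 5 epochs. A sketch of the usual TF 1.x wiring: the 183 steps per epoch is taken from the "[.../183]" progress lines later in this log, while decay_rate = 0.94 is an assumed placeholder, since the actual factor is not printed:

    import tensorflow as tf

    steps_per_epoch = 183  # from the "[.../183]" progress lines below
    global_step = tf.train.get_or_create_global_step()
    lr = tf.train.exponential_decay(
        learning_rate=0.001,              # printed configuration value
        global_step=global_step,
        decay_steps=5 * steps_per_epoch,  # num_epochs_per_decay = 5
        decay_rate=0.94,                  # assumption: not printed in this log
        staircase=True)
    optimizer = tf.train.RMSPropOptimizer(lr, momentum=0.9)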
inputs.get_shape().as_list() = [None, 1, 1, 1000]
embedding_vector.get_shape().as_list() = [None, 1, 1, 256]
embedding_vector.get_shape().as_list() = [None, 1, 1, 64]
inputs.get_shape().as_list() = [None, 1, 1, 1000]
embedding_vector.get_shape().as_list() = [None, 1, 1, 256]
embedding_vector.get_shape().as_list() = [None, 1, 1, 64]
***********************
the pair_type is vector
***********************
************************
*Build contrastive loss*
************************
Traceback (most recent call last):
  File "main.py", line 97, in <module>
    main(args)
  File "main.py", line 88, in main
    with tf.Session(config=config) as sess:
  File "/home/gxdai/py2/lib/python3.5/site-packages/tensorflow/python/client/session.py", line 1511, in __init__
    super(Session, self).__init__(target, graph, config=config)
  File "/home/gxdai/py2/lib/python3.5/site-packages/tensorflow/python/client/session.py", line 634, in __init__
    self._session = tf_session.TF_NewSessionRef(self._graph._c_graph, opts)
tensorflow.python.framework.errors_impl.InternalError: CUDA runtime implicit initialization on GPU:0 failed. Status: out of memory
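This InternalError is an environment problem, not a model bug: the device report above shows only 54.06 MiB of the TITAN Xp's 11.91 GiB free, so another process held the GPU when tf.Session tried to initialize the CUDA runtime (the retry below at 10:27, with 11.74 GiB free, succeeds). When the card must be shared, a TF 1.x session can be told not to claim all memory up front; a sketch, assuming this is where main.py builds its config:

    import tensorflow as tf

    config = tf.ConfigProto()
    config.gpu_options.allow_growth = True  # allocate GPU memory on demand
    # Alternatively, cap this process at a fixed share of the card:
    # config.gpu_options.per_process_gpu_memory_fraction = 0.5

    with tf.Session(config=config) as sess:
        pass  # build and run the graph as usual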
WARNING:tensorflow:From /home/gxdai/Focal-Contrastive-Loss/model.py:202: get_regularization_losses (from tensorflow.contrib.losses.python.losses.loss_ops) is deprecated and will be removed after 2016-12-30.
Instructions for updating:
Use tf.losses.get_regularization_losses instead.
2018-10-22 10:27:27.404496: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2018-10-22 10:27:28.188127: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1411] Found device 0 with properties:
name: TITAN Xp major: 6 minor: 1 memoryClockRate(GHz): 1.582
pciBusID: 0000:0a:00.0
totalMemory: 11.91GiB freeMemory: 11.74GiB
2018-10-22 10:27:28.188169: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1490] Adding visible gpu devices: 0
2018-10-22 10:27:28.453386: I tensorflow/core/common_runtime/gpu/gpu_device.cc:971] Device interconnect StreamExecutor with strength 1 edge matrix:
2018-10-22 10:27:28.453461: I tensorflow/core/common_runtime/gpu/gpu_device.cc:977] 0
2018-10-22 10:27:28.453474: I tensorflow/core/common_runtime/gpu/gpu_device.cc:990] 0: N
2018-10-22 10:27:28.453939: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1103] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 11361 MB memory) -> physical GPU (device: 0, name: TITAN Xp, pci bus id: 0000:0a:00.0, compute capability: 6.1)
ALL the args information
Namespace(batch_size=32, ckpt_dir='./models/siamese', class_num=5, display_step=20, dropout_keep_prob=0.5, embedding_size=128, eval_step=10, evaluation=0, focal_decay_factor=2.0, height=512, image_txt='/data1/Guoxian_Dai/CUB_200_2011/images.txt', label_txt='/data1/Guoxian_Dai/CUB_200_2011/image_class_labels.txt', learning_rate=0.001, learning_rate2=0.0001, learning_rate_decay_type='exponential', loss_type='contrastive_loss', margin=1.0, mode='train', momentum=0.9, num_epochs1=20, num_epochs2=10, num_epochs_per_decay=5, num_workers=4, optimizer='rmsprop', pair_type='vector', pretrained_model_path='./weights/inception_v3.ckpt', restore_ckpt=0, root_dir='/data1/Guoxian_Dai/CUB_200_2011/images', targetNum=1000, train_test_split_txt='/data1/Guoxian_Dai/CUB_200_2011/train_test_split.txt', weightFile='./models/my-model', weight_decay=0.0005, width=512, with_regularizer=False)
Configuration information
********************
optimizer = rmsprop
learning_rate = 0.00100
learning_rate_decay_type = exponential
loss_type = contrastive_loss
margin = 1.0
with_regularizer = 0
focal_decay_factor = 2.0
training image number is 5864
testing image number is 5924
inputs.get_shape().as_list() = [None, 1, 1, 1000]
embedding_vector.get_shape().as_list() = [None, 1, 1, 256]
embedding_vector.get_shape().as_list() = [None, 1, 1, 64]
inputs.get_shape().as_list() = [None, 1, 1, 1000]
embedding_vector.get_shape().as_list() = [None, 1, 1, 256]
embedding_vector.get_shape().as_list() = [None, 1, 1, 64]
***********************
the pair_type is vector
***********************
************************
*Build contrastive loss*
************************
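For reference, a minimal sketch of the contrastive loss being built here, using the margin = 1.0 from the configuration; the positive/negative split mirrors the "Loss positive"/"Loss negative" columns in the step logs below, though this is the generic formulation rather than the exact code in model.py:

    import tensorflow as tf

    def contrastive_loss(emb1, emb2, same, margin=1.0):
        """same: 1.0 for same-class pairs, 0.0 for different-class pairs."""
        d = tf.sqrt(tf.reduce_sum(tf.square(emb1 - emb2), axis=1) + 1e-12)
        loss_pos = same * tf.square(d)                                    # pull matching pairs together
        loss_neg = (1.0 - same) * tf.square(tf.maximum(margin - d, 0.0))  # push others past the margin
        return (tf.reduce_mean(loss_pos + loss_neg),
                tf.reduce_mean(loss_pos),
                tf.reduce_mean(loss_neg))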
*** InceptionV3/Conv2d_1a_3x3/weights:0 ******
*** InceptionV3/Conv2d_1a_3x3/BatchNorm/beta:0 ******
*** InceptionV3/Conv2d_2a_3x3/weights:0 ******
*** InceptionV3/Conv2d_2a_3x3/BatchNorm/beta:0 ******
*** InceptionV3/Conv2d_2b_3x3/weights:0 ******
*** InceptionV3/Conv2d_2b_3x3/BatchNorm/beta:0 ******
*** InceptionV3/Conv2d_3b_1x1/weights:0 ******
*** InceptionV3/Conv2d_3b_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Conv2d_4a_3x3/weights:0 ******
*** InceptionV3/Conv2d_4a_3x3/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_5b/Branch_0/Conv2d_0a_1x1/weights:0 ******
*** InceptionV3/Mixed_5b/Branch_0/Conv2d_0a_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_5b/Branch_1/Conv2d_0a_1x1/weights:0 ******
*** InceptionV3/Mixed_5b/Branch_1/Conv2d_0a_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_5b/Branch_1/Conv2d_0b_5x5/weights:0 ******
*** InceptionV3/Mixed_5b/Branch_1/Conv2d_0b_5x5/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_5b/Branch_2/Conv2d_0a_1x1/weights:0 ******
*** InceptionV3/Mixed_5b/Branch_2/Conv2d_0a_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_5b/Branch_2/Conv2d_0b_3x3/weights:0 ******
*** InceptionV3/Mixed_5b/Branch_2/Conv2d_0b_3x3/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_5b/Branch_2/Conv2d_0c_3x3/weights:0 ******
*** InceptionV3/Mixed_5b/Branch_2/Conv2d_0c_3x3/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_5b/Branch_3/Conv2d_0b_1x1/weights:0 ******
*** InceptionV3/Mixed_5b/Branch_3/Conv2d_0b_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_5c/Branch_0/Conv2d_0a_1x1/weights:0 ******
*** InceptionV3/Mixed_5c/Branch_0/Conv2d_0a_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_5c/Branch_1/Conv2d_0b_1x1/weights:0 ******
*** InceptionV3/Mixed_5c/Branch_1/Conv2d_0b_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_5c/Branch_1/Conv_1_0c_5x5/weights:0 ******
*** InceptionV3/Mixed_5c/Branch_1/Conv_1_0c_5x5/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_5c/Branch_2/Conv2d_0a_1x1/weights:0 ******
*** InceptionV3/Mixed_5c/Branch_2/Conv2d_0a_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_5c/Branch_2/Conv2d_0b_3x3/weights:0 ******
*** InceptionV3/Mixed_5c/Branch_2/Conv2d_0b_3x3/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_5c/Branch_2/Conv2d_0c_3x3/weights:0 ******
*** InceptionV3/Mixed_5c/Branch_2/Conv2d_0c_3x3/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_5c/Branch_3/Conv2d_0b_1x1/weights:0 ******
*** InceptionV3/Mixed_5c/Branch_3/Conv2d_0b_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_5d/Branch_0/Conv2d_0a_1x1/weights:0 ******
*** InceptionV3/Mixed_5d/Branch_0/Conv2d_0a_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_5d/Branch_1/Conv2d_0a_1x1/weights:0 ******
*** InceptionV3/Mixed_5d/Branch_1/Conv2d_0a_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_5d/Branch_1/Conv2d_0b_5x5/weights:0 ******
*** InceptionV3/Mixed_5d/Branch_1/Conv2d_0b_5x5/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_5d/Branch_2/Conv2d_0a_1x1/weights:0 ******
*** InceptionV3/Mixed_5d/Branch_2/Conv2d_0a_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_5d/Branch_2/Conv2d_0b_3x3/weights:0 ******
*** InceptionV3/Mixed_5d/Branch_2/Conv2d_0b_3x3/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_5d/Branch_2/Conv2d_0c_3x3/weights:0 ******
*** InceptionV3/Mixed_5d/Branch_2/Conv2d_0c_3x3/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_5d/Branch_3/Conv2d_0b_1x1/weights:0 ******
*** InceptionV3/Mixed_5d/Branch_3/Conv2d_0b_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6a/Branch_0/Conv2d_1a_1x1/weights:0 ******
*** InceptionV3/Mixed_6a/Branch_0/Conv2d_1a_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6a/Branch_1/Conv2d_0a_1x1/weights:0 ******
*** InceptionV3/Mixed_6a/Branch_1/Conv2d_0a_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6a/Branch_1/Conv2d_0b_3x3/weights:0 ******
*** InceptionV3/Mixed_6a/Branch_1/Conv2d_0b_3x3/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6a/Branch_1/Conv2d_1a_1x1/weights:0 ******
*** InceptionV3/Mixed_6a/Branch_1/Conv2d_1a_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6b/Branch_0/Conv2d_0a_1x1/weights:0 ******
*** InceptionV3/Mixed_6b/Branch_0/Conv2d_0a_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6b/Branch_1/Conv2d_0a_1x1/weights:0 ******
*** InceptionV3/Mixed_6b/Branch_1/Conv2d_0a_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6b/Branch_1/Conv2d_0b_1x7/weights:0 ******
*** InceptionV3/Mixed_6b/Branch_1/Conv2d_0b_1x7/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6b/Branch_1/Conv2d_0c_7x1/weights:0 ******
*** InceptionV3/Mixed_6b/Branch_1/Conv2d_0c_7x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6b/Branch_2/Conv2d_0a_1x1/weights:0 ******
*** InceptionV3/Mixed_6b/Branch_2/Conv2d_0a_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6b/Branch_2/Conv2d_0b_7x1/weights:0 ******
*** InceptionV3/Mixed_6b/Branch_2/Conv2d_0b_7x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6b/Branch_2/Conv2d_0c_1x7/weights:0 ******
*** InceptionV3/Mixed_6b/Branch_2/Conv2d_0c_1x7/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6b/Branch_2/Conv2d_0d_7x1/weights:0 ******
*** InceptionV3/Mixed_6b/Branch_2/Conv2d_0d_7x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6b/Branch_2/Conv2d_0e_1x7/weights:0 ******
*** InceptionV3/Mixed_6b/Branch_2/Conv2d_0e_1x7/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6b/Branch_3/Conv2d_0b_1x1/weights:0 ******
*** InceptionV3/Mixed_6b/Branch_3/Conv2d_0b_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6c/Branch_0/Conv2d_0a_1x1/weights:0 ******
*** InceptionV3/Mixed_6c/Branch_0/Conv2d_0a_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6c/Branch_1/Conv2d_0a_1x1/weights:0 ******
*** InceptionV3/Mixed_6c/Branch_1/Conv2d_0a_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6c/Branch_1/Conv2d_0b_1x7/weights:0 ******
*** InceptionV3/Mixed_6c/Branch_1/Conv2d_0b_1x7/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6c/Branch_1/Conv2d_0c_7x1/weights:0 ******
*** InceptionV3/Mixed_6c/Branch_1/Conv2d_0c_7x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6c/Branch_2/Conv2d_0a_1x1/weights:0 ******
*** InceptionV3/Mixed_6c/Branch_2/Conv2d_0a_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6c/Branch_2/Conv2d_0b_7x1/weights:0 ******
*** InceptionV3/Mixed_6c/Branch_2/Conv2d_0b_7x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6c/Branch_2/Conv2d_0c_1x7/weights:0 ******
*** InceptionV3/Mixed_6c/Branch_2/Conv2d_0c_1x7/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6c/Branch_2/Conv2d_0d_7x1/weights:0 ******
*** InceptionV3/Mixed_6c/Branch_2/Conv2d_0d_7x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6c/Branch_2/Conv2d_0e_1x7/weights:0 ******
*** InceptionV3/Mixed_6c/Branch_2/Conv2d_0e_1x7/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6c/Branch_3/Conv2d_0b_1x1/weights:0 ******
*** InceptionV3/Mixed_6c/Branch_3/Conv2d_0b_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6d/Branch_0/Conv2d_0a_1x1/weights:0 ******
*** InceptionV3/Mixed_6d/Branch_0/Conv2d_0a_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6d/Branch_1/Conv2d_0a_1x1/weights:0 ******
*** InceptionV3/Mixed_6d/Branch_1/Conv2d_0a_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6d/Branch_1/Conv2d_0b_1x7/weights:0 ******
*** InceptionV3/Mixed_6d/Branch_1/Conv2d_0b_1x7/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6d/Branch_1/Conv2d_0c_7x1/weights:0 ******
*** InceptionV3/Mixed_6d/Branch_1/Conv2d_0c_7x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6d/Branch_2/Conv2d_0a_1x1/weights:0 ******
*** InceptionV3/Mixed_6d/Branch_2/Conv2d_0a_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6d/Branch_2/Conv2d_0b_7x1/weights:0 ******
*** InceptionV3/Mixed_6d/Branch_2/Conv2d_0b_7x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6d/Branch_2/Conv2d_0c_1x7/weights:0 ******
*** InceptionV3/Mixed_6d/Branch_2/Conv2d_0c_1x7/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6d/Branch_2/Conv2d_0d_7x1/weights:0 ******
*** InceptionV3/Mixed_6d/Branch_2/Conv2d_0d_7x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6d/Branch_2/Conv2d_0e_1x7/weights:0 ******
*** InceptionV3/Mixed_6d/Branch_2/Conv2d_0e_1x7/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6d/Branch_3/Conv2d_0b_1x1/weights:0 ******
*** InceptionV3/Mixed_6d/Branch_3/Conv2d_0b_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6e/Branch_0/Conv2d_0a_1x1/weights:0 ******
*** InceptionV3/Mixed_6e/Branch_0/Conv2d_0a_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6e/Branch_1/Conv2d_0a_1x1/weights:0 ******
*** InceptionV3/Mixed_6e/Branch_1/Conv2d_0a_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6e/Branch_1/Conv2d_0b_1x7/weights:0 ******
*** InceptionV3/Mixed_6e/Branch_1/Conv2d_0b_1x7/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6e/Branch_1/Conv2d_0c_7x1/weights:0 ******
*** InceptionV3/Mixed_6e/Branch_1/Conv2d_0c_7x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6e/Branch_2/Conv2d_0a_1x1/weights:0 ******
*** InceptionV3/Mixed_6e/Branch_2/Conv2d_0a_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6e/Branch_2/Conv2d_0b_7x1/weights:0 ******
*** InceptionV3/Mixed_6e/Branch_2/Conv2d_0b_7x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6e/Branch_2/Conv2d_0c_1x7/weights:0 ******
*** InceptionV3/Mixed_6e/Branch_2/Conv2d_0c_1x7/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6e/Branch_2/Conv2d_0d_7x1/weights:0 ******
*** InceptionV3/Mixed_6e/Branch_2/Conv2d_0d_7x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6e/Branch_2/Conv2d_0e_1x7/weights:0 ******
*** InceptionV3/Mixed_6e/Branch_2/Conv2d_0e_1x7/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_6e/Branch_3/Conv2d_0b_1x1/weights:0 ******
*** InceptionV3/Mixed_6e/Branch_3/Conv2d_0b_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_7a/Branch_0/Conv2d_0a_1x1/weights:0 ******
*** InceptionV3/Mixed_7a/Branch_0/Conv2d_0a_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_7a/Branch_0/Conv2d_1a_3x3/weights:0 ******
*** InceptionV3/Mixed_7a/Branch_0/Conv2d_1a_3x3/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_7a/Branch_1/Conv2d_0a_1x1/weights:0 ******
*** InceptionV3/Mixed_7a/Branch_1/Conv2d_0a_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_7a/Branch_1/Conv2d_0b_1x7/weights:0 ******
*** InceptionV3/Mixed_7a/Branch_1/Conv2d_0b_1x7/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_7a/Branch_1/Conv2d_0c_7x1/weights:0 ******
*** InceptionV3/Mixed_7a/Branch_1/Conv2d_0c_7x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_7a/Branch_1/Conv2d_1a_3x3/weights:0 ******
*** InceptionV3/Mixed_7a/Branch_1/Conv2d_1a_3x3/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_7b/Branch_0/Conv2d_0a_1x1/weights:0 ******
*** InceptionV3/Mixed_7b/Branch_0/Conv2d_0a_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_7b/Branch_1/Conv2d_0a_1x1/weights:0 ******
*** InceptionV3/Mixed_7b/Branch_1/Conv2d_0a_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_7b/Branch_1/Conv2d_0b_1x3/weights:0 ******
*** InceptionV3/Mixed_7b/Branch_1/Conv2d_0b_1x3/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_7b/Branch_1/Conv2d_0b_3x1/weights:0 ******
*** InceptionV3/Mixed_7b/Branch_1/Conv2d_0b_3x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_7b/Branch_2/Conv2d_0a_1x1/weights:0 ******
*** InceptionV3/Mixed_7b/Branch_2/Conv2d_0a_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_7b/Branch_2/Conv2d_0b_3x3/weights:0 ******
*** InceptionV3/Mixed_7b/Branch_2/Conv2d_0b_3x3/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_7b/Branch_2/Conv2d_0c_1x3/weights:0 ******
*** InceptionV3/Mixed_7b/Branch_2/Conv2d_0c_1x3/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_7b/Branch_2/Conv2d_0d_3x1/weights:0 ******
*** InceptionV3/Mixed_7b/Branch_2/Conv2d_0d_3x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_7b/Branch_3/Conv2d_0b_1x1/weights:0 ******
*** InceptionV3/Mixed_7b/Branch_3/Conv2d_0b_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_7c/Branch_0/Conv2d_0a_1x1/weights:0 ******
*** InceptionV3/Mixed_7c/Branch_0/Conv2d_0a_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_7c/Branch_1/Conv2d_0a_1x1/weights:0 ******
*** InceptionV3/Mixed_7c/Branch_1/Conv2d_0a_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_7c/Branch_1/Conv2d_0b_1x3/weights:0 ******
*** InceptionV3/Mixed_7c/Branch_1/Conv2d_0b_1x3/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_7c/Branch_1/Conv2d_0c_3x1/weights:0 ******
*** InceptionV3/Mixed_7c/Branch_1/Conv2d_0c_3x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_7c/Branch_2/Conv2d_0a_1x1/weights:0 ******
*** InceptionV3/Mixed_7c/Branch_2/Conv2d_0a_1x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_7c/Branch_2/Conv2d_0b_3x3/weights:0 ******
*** InceptionV3/Mixed_7c/Branch_2/Conv2d_0b_3x3/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_7c/Branch_2/Conv2d_0c_1x3/weights:0 ******
*** InceptionV3/Mixed_7c/Branch_2/Conv2d_0c_1x3/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_7c/Branch_2/Conv2d_0d_3x1/weights:0 ******
*** InceptionV3/Mixed_7c/Branch_2/Conv2d_0d_3x1/BatchNorm/beta:0 ******
*** InceptionV3/Mixed_7c/Branch_3/Conv2d_0b_1x1/weights:0 ******
*** InceptionV3/Mixed_7c/Branch_3/Conv2d_0b_1x1/BatchNorm/beta:0 ******
*** InceptionV3/AuxLogits/Conv2d_1b_1x1/weights:0 ******
*** InceptionV3/AuxLogits/Conv2d_1b_1x1/BatchNorm/beta:0 ******
*** InceptionV3/AuxLogits/Conv2d_2a_5x5/weights:0 ******
*** InceptionV3/AuxLogits/Conv2d_2a_5x5/BatchNorm/beta:0 ******
*** InceptionV3/AuxLogits/Conv2d_2b_1x1/weights:0 ******
*** InceptionV3/AuxLogits/Conv2d_2b_1x1/biases:0 ******
*** InceptionV3/Logits/Conv2d_1c_1x1/weights:0 ******
*** InceptionV3/Logits/Conv2d_1c_1x1/biases:0 ******
*** InceptionV3/embedding/Conv2d_1c_1x1/weights:0 ******
*** InceptionV3/embedding/Conv2d_1c_1x1/weights:0 ******
*** InceptionV3/embedding/Conv2d_1c_1x1/biases:0 ******
*** InceptionV3/embedding/Conv2d_1c_1x1/biases:0 ******
*** InceptionV3/embedding/Conv2d_1d_1x1/weights:0 ******
*** InceptionV3/embedding/Conv2d_1d_1x1/weights:0 ******
*** InceptionV3/embedding/Conv2d_1d_1x1/biases:0 ******
*** InceptionV3/embedding/Conv2d_1d_1x1/biases:0 ******
Recall@1: 0.29743
Recall@2: 0.40834
Recall@4: 0.53393
Recall@8: 0.66745
Recall@16: 0.79136
Recall@32: 0.88049
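Recall@K here is the standard retrieval reading: the fraction of test images whose K nearest embeddings (the query itself excluded) contain at least one image of the same class. A numpy sketch under that assumption:

    import numpy as np

    def recall_at_k(emb, labels, ks=(1, 2, 4, 8, 16, 32)):
        sq = np.sum(emb ** 2, axis=1)
        d = sq[:, None] + sq[None, :] - 2.0 * emb @ emb.T  # pairwise squared distances
        np.fill_diagonal(d, np.inf)                        # never retrieve the query itself
        nn = np.argsort(d, axis=1)                         # neighbors, nearest first
        hit = labels[nn] == labels[:, None]                # same-class indicator per rank
        return {k: float(hit[:, :k].any(axis=1).mean()) for k in ks}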
Inside function n_clusters = 100
sampler_num_per_class = [50. 60. 60. 60. 49. 60. 59. 60. 60. 60. 60. 60. 50. 60. 59. 60. 59. 60.
59. 60. 60. 60. 60. 59. 59. 59. 60. 60. 60. 60. 60. 60. 60. 60. 59. 60.
60. 60. 60. 60. 58. 60. 60. 60. 60. 60. 60. 60. 59. 60. 51. 60. 59. 60.
60. 60. 59. 60. 60. 59. 60. 60. 60. 60. 60. 59. 60. 59. 59. 60. 60. 60.
60. 60. 60. 60. 60. 56. 59. 60. 59. 60. 60. 60. 60. 60. 50. 60. 60. 58.
60. 60. 60. 60. 60. 59. 60. 60. 60. 60.]
sampler_num_per_cluster = [ 27. 94. 47. 94. 50. 106. 83. 107. 48. 33. 90. 113. 55. 41.
53. 72. 44. 33. 14. 78. 58. 44. 42. 93. 60. 74. 66. 16.
28. 74. 78. 77. 64. 89. 81. 47. 53. 33. 39. 65. 42. 41.
30. 96. 89. 71. 91. 39. 117. 59. 77. 73. 48. 54. 18. 46.
59. 72. 52. 26. 32. 81. 62. 75. 28. 29. 47. 92. 47. 44.
73. 50. 76. 108. 61. 62. 80. 91. 99. 22. 78. 34. 75. 41.
48. 16. 9. 85. 79. 38. 83. 37. 52. 55. 10. 11. 38. 80.
96. 37.]
Purity is 0.228
count_cross = [[0. 0. 0. ... 0. 0. 0.]
[0. 2. 0. ... 0. 0. 6.]
[0. 0. 1. ... 0. 0. 0.]
...
[0. 9. 5. ... 1. 0. 0.]
[0. 0. 0. ... 0. 0. 6.]
[0. 0. 0. ... 0. 0. 0.]]
Mutual information is 1.98683
5924.0
5924
Entropy cluster is 4.50334
Entropy class is 4.60444
normalized_mutual_information is 0.43629
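The printed value matches mutual information normalized by the geometric mean of the two entropies (up to rounding of the printed inputs):

    \mathrm{NMI} = \frac{I(C;Y)}{\sqrt{H(C)\,H(Y)}}
                 = \frac{1.98683}{\sqrt{4.50334 \times 4.60444}}
                 \approx \frac{1.98683}{4.5536} \approx 0.4363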
tp_and_fp = 205810.0
tp = 20019.0
fp is 185791.0
fn is 152731.0
RI is 0.9807043189762656
Precision is 0.09726932607745008
Recall is 0.11588422575976845
F_1 is 0.10576394759087067
normalized_mutual_information = 0.43629366693315863
RI = 0.9807043189762656
F_1 = 0.10576394759087067
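The pair-counting metrics above are self-consistent under the standard definitions over all C(5924, 2) = 17,543,926 point pairs; tn = 17,185,385 is inferred as pairs − tp − fp − fn rather than printed:

    P   = \frac{tp}{tp+fp} = \frac{20019}{205810} \approx 0.097269
    R   = \frac{tp}{tp+fn} = \frac{20019}{172750} \approx 0.115884
    F_1 = \frac{2PR}{P+R} \approx 0.105764
    \mathrm{RI} = \frac{tp+tn}{\binom{5924}{2}} = \frac{17205404}{17543926} \approx 0.980704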
The NN is 0.29743
The FT is 0.12791
The ST is 0.19360
The DCG is 0.50845
The E is 0.11098
The MAP is 0.09559
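The remaining retrieval numbers follow the usual shape-retrieval conventions (the evaluation code is not shown in this log, so these are the standard readings): NN is nearest-neighbor precision, which is why it coincides with Recall@1 = 0.29743 above; DCG is the discounted cumulative gain, E the E-measure over a fixed-length result list, and MAP the mean average precision. For a query whose class has |C| members, the tier measures are

    \mathrm{FT} = \frac{\#\{\text{class members in top } |C|-1\}}{|C|-1}, \qquad
    \mathrm{ST} = \frac{\#\{\text{class members in top } 2(|C|-1)\}}{|C|-1}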
2018-10-22 10:29:47.095760: Epoch [ 0/1000] [ 20/183], total loss: 0.00000, regularization loss: 0.29028, contrastive loss: 0.00000, Loss positive: 0.00000, Loss negative: 0.00000
2018-10-22 10:29:57.102962: Epoch [ 0/1000] [ 40/183], total loss: 0.00000, regularization loss: 0.29028, contrastive loss: 0.00000, Loss positive: 0.00000, Loss negative: 0.00000
2018-10-22 10:30:07.106327: Epoch [ 0/1000] [ 60/183], total loss: 0.00000, regularization loss: 0.29029, contrastive loss: 0.00000, Loss positive: 0.00000, Loss negative: 0.00000
2018-10-22 10:30:17.173127: Epoch [ 0/1000] [ 80/183], total loss: 0.00000, regularization loss: 0.29029, contrastive loss: 0.00000, Loss positive: 0.00000, Loss negative: 0.00000
2018-10-22 10:30:27.268290: Epoch [ 0/1000] [100/183], total loss: 0.00000, regularization loss: 0.29029, contrastive loss: 0.00000, Loss positive: 0.00000, Loss negative: 0.00000
2018-10-22 10:30:37.364981: Epoch [ 0/1000] [120/183], total loss: 0.00307, regularization loss: 0.29028, contrastive loss: 0.00307, Loss positive: 0.00000, Loss negative: 0.00307
2018-10-22 10:30:47.515846: Epoch [ 0/1000] [140/183], total loss: 0.00014, regularization loss: 0.29028, contrastive loss: 0.00014, Loss positive: 0.00000, Loss negative: 0.00014
2018-10-22 10:30:57.701827: Epoch [ 0/1000] [160/183], total loss: 0.00000, regularization loss: 0.29028, contrastive loss: 0.00000, Loss positive: 0.00000, Loss negative: 0.00000
2018-10-22 10:31:07.874505: Epoch [ 0/1000] [180/183], total loss: 0.00064, regularization loss: 0.29028, contrastive loss: 0.00064, Loss positive: 0.00000, Loss negative: 0.00064
Recall@1: 0.07056
Recall@2: 0.11006
Recall@4: 0.18720
Recall@8: 0.29321
Recall@16: 0.43079
Recall@32: 0.59993
Inside function n_clusters = 100
sampler_num_per_class = [50. 60. 60. 60. 49. 60. 59. 60. 60. 60. 60. 60. 50. 60. 59. 60. 59. 60.
59. 60. 60. 60. 60. 59. 59. 59. 60. 60. 60. 60. 60. 60. 60. 60. 59. 60.
60. 60. 60. 60. 58. 60. 60. 60. 60. 60. 60. 60. 59. 60. 51. 60. 59. 60.
60. 60. 59. 60. 60. 59. 60. 60. 60. 60. 60. 59. 60. 59. 59. 60. 60. 60.
60. 60. 60. 60. 60. 56. 59. 60. 59. 60. 60. 60. 60. 60. 50. 60. 60. 58.
60. 60. 60. 60. 60. 59. 60. 60. 60. 60.]
sampler_num_per_cluster = [ 29. 28. 116. 107. 72. 30. 62. 31. 65. 57. 63. 81. 55. 58.
44. 93. 70. 83. 95. 78. 64. 50. 50. 67. 60. 47. 30. 66.
66. 61. 81. 43. 31. 73. 42. 55. 64. 111. 50. 44. 27. 54.
65. 19. 87. 47. 54. 41. 76. 44. 41. 30. 61. 91. 103. 91.
46. 108. 79. 42. 84. 47. 69. 57. 60. 33. 57. 42. 46. 57.
69. 65. 56. 100. 39. 101. 55. 42. 68. 15. 16. 50. 29. 38.
55. 73. 86. 20. 66. 64. 48. 33. 83. 97. 41. 61. 31. 81.
33. 79.]
Purity is 0.117
count_cross = [[0. 0. 0. ... 0. 0. 0.]
[0. 0. 0. ... 0. 0. 0.]
[0. 1. 0. ... 1. 1. 0.]
...
[0. 0. 1. ... 0. 0. 0.]
[0. 0. 0. ... 0. 0. 1.]
[0. 3. 3. ... 0. 1. 0.]]
Mutual information is 1.31250
5924.0
5924
Entropy cluster is 4.52909
Entropy class is 4.60444
normalized_mutual_information is 0.28740
tp_and_fp = 198714.0
tp = 6438.0
fp is 192276.0
fn is 166312.0
RI is 0.9795605613019571
Precision is 0.032398321205350404
Recall is 0.037267727930535455
F_1 is 0.034662847543772746
normalized_mutual_information = 0.28740154896775394
RI = 0.9795605613019571
F_1 = 0.034662847543772746
The NN is 0.07056
The FT is 0.04282
The ST is 0.07334
The DCG is 0.40824
The E is 0.03439
The MAP is 0.03189
2018-10-22 10:32:53.637476: Epoch [ 1/1000] [ 20/183], total loss: 0.00258, regularization loss: 0.29028, contrastive loss: 0.00258, Loss positive: 0.00000, Loss negative: 0.00258
2018-10-22 10:33:03.738269: Epoch [ 1/1000] [ 40/183], total loss: 0.00034, regularization loss: 0.29028, contrastive loss: 0.00034, Loss positive: 0.00000, Loss negative: 0.00034
2018-10-22 10:33:13.903756: Epoch [ 1/1000] [ 60/183], total loss: 0.04459, regularization loss: 0.29028, contrastive loss: 0.04459, Loss positive: 0.03153, Loss negative: 0.01307
2018-10-22 10:33:24.093029: Epoch [ 1/1000] [ 80/183], total loss: 0.00172, regularization loss: 0.29027, contrastive loss: 0.00172, Loss positive: 0.00000, Loss negative: 0.00172
2018-10-22 10:33:34.306193: Epoch [ 1/1000] [100/183], total loss: 0.00075, regularization loss: 0.29027, contrastive loss: 0.00075, Loss positive: 0.00000, Loss negative: 0.00075
2018-10-22 10:33:44.513736: Epoch [ 1/1000] [120/183], total loss: 0.00000, regularization loss: 0.29027, contrastive loss: 0.00000, Loss positive: 0.00000, Loss negative: 0.00000
2018-10-22 10:33:54.718783: Epoch [ 1/1000] [140/183], total loss: 0.00090, regularization loss: 0.29027, contrastive loss: 0.00090, Loss positive: 0.00000, Loss negative: 0.00090
2018-10-22 10:34:04.955308: Epoch [ 1/1000] [160/183], total loss: 0.08142, regularization loss: 0.29027, contrastive loss: 0.08142, Loss positive: 0.06984, Loss negative: 0.01158
2018-10-22 10:34:15.188704: Epoch [ 1/1000] [180/183], total loss: 0.15240, regularization loss: 0.29027, contrastive loss: 0.15240, Loss positive: 0.15137, Loss negative: 0.00103
2018-10-22 10:34:39.161866: Epoch [ 2/1000] [ 20/183], total loss: 0.04136, regularization loss: 0.29027, contrastive loss: 0.04136, Loss positive: 0.04012, Loss negative: 0.00124
2018-10-22 10:34:49.393309: Epoch [ 2/1000] [ 40/183], total loss: 0.21839, regularization loss: 0.29027, contrastive loss: 0.21839, Loss positive: 0.21531, Loss negative: 0.00308
2018-10-22 10:34:59.562422: Epoch [ 2/1000] [ 60/183], total loss: 0.13641, regularization loss: 0.29027, contrastive loss: 0.13641, Loss positive: 0.12889, Loss negative: 0.00752
2018-10-22 10:35:09.691310: Epoch [ 2/1000] [ 80/183], total loss: 0.00438, regularization loss: 0.29027, contrastive loss: 0.00438, Loss positive: 0.00000, Loss negative: 0.00438
2018-10-22 10:35:19.876170: Epoch [ 2/1000] [100/183], total loss: 0.00606, regularization loss: 0.29027, contrastive loss: 0.00606, Loss positive: 0.00000, Loss negative: 0.00606
2018-10-22 10:35:30.081464: Epoch [ 2/1000] [120/183], total loss: 0.00000, regularization loss: 0.29027, contrastive loss: 0.00000, Loss positive: 0.00000, Loss negative: 0.00000
2018-10-22 10:35:40.368954: Epoch [ 2/1000] [140/183], total loss: 0.01376, regularization loss: 0.29027, contrastive loss: 0.01376, Loss positive: 0.01235, Loss negative: 0.00141
2018-10-22 10:35:50.590202: Epoch [ 2/1000] [160/183], total loss: 0.00000, regularization loss: 0.29027, contrastive loss: 0.00000, Loss positive: 0.00000, Loss negative: 0.00000
2018-10-22 10:36:00.844219: Epoch [ 2/1000] [180/183], total loss: 0.00077, regularization loss: 0.29027, contrastive loss: 0.00077, Loss positive: 0.00000, Loss negative: 0.00077
2018-10-22 10:36:23.352372: Epoch [ 3/1000] [ 20/183], total loss: 0.00000, regularization loss: 0.29027, contrastive loss: 0.00000, Loss positive: 0.00000, Loss negative: 0.00000
2018-10-22 10:36:33.467454: Epoch [ 3/1000] [ 40/183], total loss: 0.00705, regularization loss: 0.29027, contrastive loss: 0.00705, Loss positive: 0.00000, Loss negative: 0.00705
2018-10-22 10:36:43.630359: Epoch [ 3/1000] [ 60/183], total loss: 0.00014, regularization loss: 0.29027, contrastive loss: 0.00014, Loss positive: 0.00000, Loss negative: 0.00014
2018-10-22 10:36:53.850405: Epoch [ 3/1000] [ 80/183], total loss: 0.00266, regularization loss: 0.29027, contrastive loss: 0.00266, Loss positive: 0.00000, Loss negative: 0.00266
2018-10-22 10:37:04.077053: Epoch [ 3/1000] [100/183], total loss: 0.02895, regularization loss: 0.29027, contrastive loss: 0.02895, Loss positive: 0.02815, Loss negative: 0.00080
2018-10-22 10:37:14.325432: Epoch [ 3/1000] [120/183], total loss: 0.00285, regularization loss: 0.29027, contrastive loss: 0.00285, Loss positive: 0.00000, Loss negative: 0.00285
2018-10-22 10:37:24.649927: Epoch [ 3/1000] [140/183], total loss: 0.07570, regularization loss: 0.29027, contrastive loss: 0.07570, Loss positive: 0.07459, Loss negative: 0.00111
2018-10-22 10:37:35.032088: Epoch [ 3/1000] [160/183], total loss: 0.00668, regularization loss: 0.29027, contrastive loss: 0.00668, Loss positive: 0.00000, Loss negative: 0.00668
2018-10-22 10:37:45.292068: Epoch [ 3/1000] [180/183], total loss: 0.02530, regularization loss: 0.29027, contrastive loss: 0.02530, Loss positive: 0.01414, Loss negative: 0.01116
2018-10-22 10:38:07.530263: Epoch [ 4/1000] [ 20/183], total loss: 0.00004, regularization loss: 0.29027, contrastive loss: 0.00004, Loss positive: 0.00000, Loss negative: 0.00004
2018-10-22 10:38:17.717332: Epoch [ 4/1000] [ 40/183], total loss: 0.00358, regularization loss: 0.29026, contrastive loss: 0.00358, Loss positive: 0.00000, Loss negative: 0.00358
2018-10-22 10:38:27.915274: Epoch [ 4/1000] [ 60/183], total loss: 0.00241, regularization loss: 0.29026, contrastive loss: 0.00241, Loss positive: 0.00000, Loss negative: 0.00241
2018-10-22 10:38:38.151377: Epoch [ 4/1000] [ 80/183], total loss: 0.00867, regularization loss: 0.29026, contrastive loss: 0.00867, Loss positive: 0.00000, Loss negative: 0.00867
2018-10-22 10:38:48.368553: Epoch [ 4/1000] [100/183], total loss: 0.00340, regularization loss: 0.29027, contrastive loss: 0.00340, Loss positive: 0.00000, Loss negative: 0.00340
2018-10-22 10:38:58.580593: Epoch [ 4/1000] [120/183], total loss: 0.00243, regularization loss: 0.29026, contrastive loss: 0.00243, Loss positive: 0.00000, Loss negative: 0.00243
2018-10-22 10:39:09.052658: Epoch [ 4/1000] [140/183], total loss: 0.00285, regularization loss: 0.29026, contrastive loss: 0.00285, Loss positive: 0.00000, Loss negative: 0.00285
2018-10-22 10:39:19.464916: Epoch [ 4/1000] [160/183], total loss: 0.00000, regularization loss: 0.29026, contrastive loss: 0.00000, Loss positive: 0.00000, Loss negative: 0.00000
2018-10-22 10:39:29.810717: Epoch [ 4/1000] [180/183], total loss: 0.10888, regularization loss: 0.29026, contrastive loss: 0.10888, Loss positive: 0.10594, Loss negative: 0.00294
2018-10-22 10:39:52.163938: Epoch [ 5/1000] [ 20/183], total loss: 0.00433, regularization loss: 0.29026, contrastive loss: 0.00433, Loss positive: 0.00000, Loss negative: 0.00433
2018-10-22 10:40:02.333270: Epoch [ 5/1000] [ 40/183], total loss: 0.00281, regularization loss: 0.29026, contrastive loss: 0.00281, Loss positive: 0.00000, Loss negative: 0.00281
2018-10-22 10:40:12.544936: Epoch [ 5/1000] [ 60/183], total loss: 0.00001, regularization loss: 0.29026, contrastive loss: 0.00001, Loss positive: 0.00000, Loss negative: 0.00001
2018-10-22 10:40:22.743872: Epoch [ 5/1000] [ 80/183], total loss: 0.07717, regularization loss: 0.29026, contrastive loss: 0.07717, Loss positive: 0.07460, Loss negative: 0.00257
2018-10-22 10:40:32.995636: Epoch [ 5/1000] [100/183], total loss: 0.00210, regularization loss: 0.29026, contrastive loss: 0.00210, Loss positive: 0.00000, Loss negative: 0.00210
2018-10-22 10:40:43.219217: Epoch [ 5/1000] [120/183], total loss: 0.00712, regularization loss: 0.29026, contrastive loss: 0.00712, Loss positive: 0.00000, Loss negative: 0.00712
2018-10-22 10:40:53.789157: Epoch [ 5/1000] [140/183], total loss: 0.04842, regularization loss: 0.29026, contrastive loss: 0.04842, Loss positive: 0.04115, Loss negative: 0.00727
2018-10-22 10:41:04.059098: Epoch [ 5/1000] [160/183], total loss: 0.00083, regularization loss: 0.29026, contrastive loss: 0.00083, Loss positive: 0.00000, Loss negative: 0.00083
2018-10-22 10:41:14.302505: Epoch [ 5/1000] [180/183], total loss: 0.00003, regularization loss: 0.29026, contrastive loss: 0.00003, Loss positive: 0.00000, Loss negative: 0.00003
2018-10-22 10:41:38.311791: Epoch [ 6/1000] [ 20/183], total loss: 0.00552, regularization loss: 0.29026, contrastive loss: 0.00552, Loss positive: 0.00000, Loss negative: 0.00552
2018-10-22 10:41:48.520969: Epoch [ 6/1000] [ 40/183], total loss: 0.00245, regularization loss: 0.29025, contrastive loss: 0.00245, Loss positive: 0.00000, Loss negative: 0.00245
2018-10-22 10:41:58.741023: Epoch [ 6/1000] [ 60/183], total loss: 0.10036, regularization loss: 0.29025, contrastive loss: 0.10036, Loss positive: 0.10006, Loss negative: 0.00030
2018-10-22 10:42:08.973994: Epoch [ 6/1000] [ 80/183], total loss: 0.00857, regularization loss: 0.29025, contrastive loss: 0.00857, Loss positive: 0.00000, Loss negative: 0.00857
2018-10-22 10:42:19.308161: Epoch [ 6/1000] [100/183], total loss: 0.00078, regularization loss: 0.29025, contrastive loss: 0.00078, Loss positive: 0.00000, Loss negative: 0.00078
2018-10-22 10:42:29.541354: Epoch [ 6/1000] [120/183], total loss: 0.00250, regularization loss: 0.29026, contrastive loss: 0.00250, Loss positive: 0.00000, Loss negative: 0.00250
2018-10-22 10:42:40.111223: Epoch [ 6/1000] [140/183], total loss: 0.00000, regularization loss: 0.29026, contrastive loss: 0.00000, Loss positive: 0.00000, Loss negative: 0.00000
2018-10-22 10:42:50.420945: Epoch [ 6/1000] [160/183], total loss: 0.00102, regularization loss: 0.29025, contrastive loss: 0.00102, Loss positive: 0.00000, Loss negative: 0.00102
2018-10-22 10:43:00.803190: Epoch [ 6/1000] [180/183], total loss: 0.00302, regularization loss: 0.29025, contrastive loss: 0.00302, Loss positive: 0.00000, Loss negative: 0.00302
2018-10-22 10:43:24.741653: Epoch [ 7/1000] [ 20/183], total loss: 0.00272, regularization loss: 0.29025, contrastive loss: 0.00272, Loss positive: 0.00000, Loss negative: 0.00272
2018-10-22 10:43:34.934801: Epoch [ 7/1000] [ 40/183], total loss: 0.00198, regularization loss: 0.29025, contrastive loss: 0.00198, Loss positive: 0.00000, Loss negative: 0.00198
2018-10-22 10:43:45.192828: Epoch [ 7/1000] [ 60/183], total loss: 0.06829, regularization loss: 0.29025, contrastive loss: 0.06829, Loss positive: 0.06305, Loss negative: 0.00524
2018-10-22 10:43:55.415601: Epoch [ 7/1000] [ 80/183], total loss: 0.06046, regularization loss: 0.29025, contrastive loss: 0.06046, Loss positive: 0.05691, Loss negative: 0.00355
2018-10-22 10:44:05.824558: Epoch [ 7/1000] [100/183], total loss: 0.00001, regularization loss: 0.29025, contrastive loss: 0.00001, Loss positive: 0.00000, Loss negative: 0.00001
2018-10-22 10:44:16.065052: Epoch [ 7/1000] [120/183], total loss: 0.00228, regularization loss: 0.29025, contrastive loss: 0.00228, Loss positive: 0.00000, Loss negative: 0.00228
2018-10-22 10:44:26.392742: Epoch [ 7/1000] [140/183], total loss: 0.03356, regularization loss: 0.29025, contrastive loss: 0.03356, Loss positive: 0.02649, Loss negative: 0.00706
2018-10-22 10:44:36.650294: Epoch [ 7/1000] [160/183], total loss: 0.00682, regularization loss: 0.29025, contrastive loss: 0.00682, Loss positive: 0.00000, Loss negative: 0.00682
2018-10-22 10:44:47.093908: Epoch [ 7/1000] [180/183], total loss: 0.04538, regularization loss: 0.29025, contrastive loss: 0.04538, Loss positive: 0.03423, Loss negative: 0.01115
2018-10-22 10:45:09.536614: Epoch [ 8/1000] [ 20/183], total loss: 0.00255, regularization loss: 0.29025, contrastive loss: 0.00255, Loss positive: 0.00000, Loss negative: 0.00255
2018-10-22 10:45:19.740050: Epoch [ 8/1000] [ 40/183], total loss: 0.04956, regularization loss: 0.29025, contrastive loss: 0.04956, Loss positive: 0.04391, Loss negative: 0.00566
2018-10-22 10:45:29.905985: Epoch [ 8/1000] [ 60/183], total loss: 0.05340, regularization loss: 0.29025, contrastive loss: 0.05340, Loss positive: 0.05170, Loss negative: 0.00171
2018-10-22 10:45:40.107571: Epoch [ 8/1000] [ 80/183], total loss: 0.00113, regularization loss: 0.29025, contrastive loss: 0.00113, Loss positive: 0.00000, Loss negative: 0.00113
2018-10-22 10:45:50.372749: Epoch [ 8/1000] [100/183], total loss: 0.07396, regularization loss: 0.29025, contrastive loss: 0.07396, Loss positive: 0.07394, Loss negative: 0.00002
2018-10-22 10:46:00.651822: Epoch [ 8/1000] [120/183], total loss: 0.00174, regularization loss: 0.29024, contrastive loss: 0.00174, Loss positive: 0.00000, Loss negative: 0.00174
2018-10-22 10:46:10.937020: Epoch [ 8/1000] [140/183], total loss: 0.01707, regularization loss: 0.29024, contrastive loss: 0.01707, Loss positive: 0.01367, Loss negative: 0.00340
2018-10-22 10:46:21.204444: Epoch [ 8/1000] [160/183], total loss: 0.00804, regularization loss: 0.29024, contrastive loss: 0.00804, Loss positive: 0.00000, Loss negative: 0.00804
2018-10-22 10:46:31.491973: Epoch [ 8/1000] [180/183], total loss: 0.00141, regularization loss: 0.29024, contrastive loss: 0.00141, Loss positive: 0.00000, Loss negative: 0.00141
2018-10-22 10:46:54.110334: Epoch [ 9/1000] [ 20/183], total loss: 0.00000, regularization loss: 0.29024, contrastive loss: 0.00000, Loss positive: 0.00000, Loss negative: 0.00000
2018-10-22 10:47:04.308340: Epoch [ 9/1000] [ 40/183], total loss: 0.00182, regularization loss: 0.29024, contrastive loss: 0.00182, Loss positive: 0.00000, Loss negative: 0.00182
2018-10-22 10:47:14.507345: Epoch [ 9/1000] [ 60/183], total loss: 0.00541, regularization loss: 0.29024, contrastive loss: 0.00541, Loss positive: 0.00000, Loss negative: 0.00541
2018-10-22 10:47:24.750234: Epoch [ 9/1000] [ 80/183], total loss: 0.00135, regularization loss: 0.29024, contrastive loss: 0.00135, Loss positive: 0.00000, Loss negative: 0.00135
2018-10-22 10:47:35.016184: Epoch [ 9/1000] [100/183], total loss: 0.04257, regularization loss: 0.29024, contrastive loss: 0.04257, Loss positive: 0.03726, Loss negative: 0.00530
2018-10-22 10:47:45.251019: Epoch [ 9/1000] [120/183], total loss: 0.00250, regularization loss: 0.29024, contrastive loss: 0.00250, Loss positive: 0.00000, Loss negative: 0.00250
2018-10-22 10:47:55.580348: Epoch [ 9/1000] [140/183], total loss: 0.02551, regularization loss: 0.29024, contrastive loss: 0.02551, Loss positive: 0.02080, Loss negative: 0.00471
2018-10-22 10:48:05.793540: Epoch [ 9/1000] [160/183], total loss: 0.00123, regularization loss: 0.29024, contrastive loss: 0.00123, Loss positive: 0.00000, Loss negative: 0.00123
2018-10-22 10:48:16.035988: Epoch [ 9/1000] [180/183], total loss: 0.01645, regularization loss: 0.29024, contrastive loss: 0.01645, Loss positive: 0.01331, Loss negative: 0.00315
2018-10-22 10:48:36.608173: Epoch [ 10/1000] [ 20/183], total loss: 0.00156, regularization loss: 0.29024, contrastive loss: 0.00156, Loss positive: 0.00000, Loss negative: 0.00156
2018-10-22 10:48:46.774034: Epoch [ 10/1000] [ 40/183], total loss: 0.01241, regularization loss: 0.29024, contrastive loss: 0.01241, Loss positive: 0.00000, Loss negative: 0.01241
2018-10-22 10:48:56.902550: Epoch [ 10/1000] [ 60/183], total loss: 0.02291, regularization loss: 0.29023, contrastive loss: 0.02291, Loss positive: 0.02253, Loss negative: 0.00038
2018-10-22 10:49:07.460718: Epoch [ 10/1000] [ 80/183], total loss: 0.04577, regularization loss: 0.29024, contrastive loss: 0.04577, Loss positive: 0.04494, Loss negative: 0.00083
2018-10-22 10:49:17.695578: Epoch [ 10/1000] [100/183], total loss: 0.04865, regularization loss: 0.29024, contrastive loss: 0.04865, Loss positive: 0.04778, Loss negative: 0.00087
2018-10-22 10:49:27.982781: Epoch [ 10/1000] [120/183], total loss: 0.00408, regularization loss: 0.29024, contrastive loss: 0.00408, Loss positive: 0.00000, Loss negative: 0.00408
2018-10-22 10:49:38.172584: Epoch [ 10/1000] [140/183], total loss: 0.00101, regularization loss: 0.29024, contrastive loss: 0.00101, Loss positive: 0.00000, Loss negative: 0.00101
2018-10-22 10:49:48.454816: Epoch [ 10/1000] [160/183], total loss: 0.00181, regularization loss: 0.29023, contrastive loss: 0.00181, Loss positive: 0.00000, Loss negative: 0.00181
2018-10-22 10:49:58.665045: Epoch [ 10/1000] [180/183], total loss: 0.00333, regularization loss: 0.29023, contrastive loss: 0.00333, Loss positive: 0.00000, Loss negative: 0.00333
Recall@1: 0.12441
Recall@2: 0.20392
Recall@4: 0.30402
Recall@8: 0.43805
Recall@16: 0.58120
Recall@32: 0.73666
Inside function n_clusters = 100
sampler_num_per_class = [50. 60. 60. 60. 49. 60. 59. 60. 60. 60. 60. 60. 50. 60. 59. 60. 59. 60.
59. 60. 60. 60. 60. 59. 59. 59. 60. 60. 60. 60. 60. 60. 60. 60. 59. 60.
60. 60. 60. 60. 58. 60. 60. 60. 60. 60. 60. 60. 59. 60. 51. 60. 59. 60.
60. 60. 59. 60. 60. 59. 60. 60. 60. 60. 60. 59. 60. 59. 59. 60. 60. 60.
60. 60. 60. 60. 60. 56. 59. 60. 59. 60. 60. 60. 60. 60. 50. 60. 60. 58.
60. 60. 60. 60. 60. 59. 60. 60. 60. 60.]
sampler_num_per_cluster = [ 61. 86. 60. 27. 46. 69. 57. 79. 61. 60. 85. 70. 55. 38.
49. 41. 36. 50. 59. 41. 65. 37. 58. 68. 86. 55. 64. 64.
29. 22. 73. 79. 92. 51. 43. 41. 62. 70. 48. 57. 79. 53.
35. 79. 30. 66. 59. 87. 69. 65. 54. 36. 82. 81. 42. 61.
78. 51. 85. 57. 23. 33. 27. 45. 74. 78. 59. 63. 68. 83.
82. 39. 73. 70. 50. 74. 48. 65. 41. 80. 60. 24. 66. 83.
47. 69. 87. 68. 74. 101. 45. 79. 49. 61. 71. 45. 33. 70.
42. 32.]
Purity is 0.163
count_cross = [[ 0. 0. 1. ... 0. 0. 0.]
[ 0. 4. 5. ... 2. 0. 0.]
[ 0. 0. 0. ... 0. 0. 0.]
...
[ 0. 0. 0. ... 0. 0. 2.]
[ 0. 0. 0. ... 0. 0. 0.]
[31. 0. 0. ... 0. 0. 0.]]
Mutual information is 1.65385
5924.0
5924
Entropy cluster is 4.55684
Entropy class is 4.60444
normalized_mutual_information is 0.36105
tp_and_fp = 188611.0
tp = 11220.0
fp is 177391.0
fn is 161530.0
RI is 0.9806815760622793
Precision is 0.05948751663476679
Recall is 0.0649493487698987
F_1 is 0.06209856625369091
normalized_mutual_information = 0.361051153216403
RI = 0.9806815760622793
F_1 = 0.06209856625369091
The NN is 0.12441
The FT is 0.07438
The ST is 0.12483
The DCG is 0.45296
The E is 0.05950
The MAP is 0.05754
2018-10-22 10:51:25.341533: Epoch [ 11/1000] [ 20/183], total loss: 0.00081, regularization loss: 0.29023, contrastive loss: 0.00081, Loss positive: 0.00000, Loss negative: 0.00081
2018-10-22 10:51:35.390243: Epoch [ 11/1000] [ 40/183], total loss: 0.02033, regularization loss: 0.29023, contrastive loss: 0.02033, Loss positive: 0.01961, Loss negative: 0.00071
2018-10-22 10:51:45.482366: Epoch [ 11/1000] [ 60/183], total loss: 0.00043, regularization loss: 0.29023, contrastive loss: 0.00043, Loss positive: 0.00000, Loss negative: 0.00043
2018-10-22 10:51:55.573700: Epoch [ 11/1000] [ 80/183], total loss: 0.00006, regularization loss: 0.29023, contrastive loss: 0.00006, Loss positive: 0.00000, Loss negative: 0.00006
2018-10-22 10:52:05.706940: Epoch [ 11/1000] [100/183], total loss: 0.03184, regularization loss: 0.29023, contrastive loss: 0.03184, Loss positive: 0.02989, Loss negative: 0.00195
2018-10-22 10:52:15.838743: Epoch [ 11/1000] [120/183], total loss: 0.03736, regularization loss: 0.29023, contrastive loss: 0.03736, Loss positive: 0.03614, Loss negative: 0.00121
2018-10-22 10:52:26.106168: Epoch [ 11/1000] [140/183], total loss: 0.00042, regularization loss: 0.29023, contrastive loss: 0.00042, Loss positive: 0.00000, Loss negative: 0.00042
2018-10-22 10:52:36.439992: Epoch [ 11/1000] [160/183], total loss: 0.00488, regularization loss: 0.29023, contrastive loss: 0.00488, Loss positive: 0.00000, Loss negative: 0.00488
2018-10-22 10:52:46.769128: Epoch [ 11/1000] [180/183], total loss: 0.01622, regularization loss: 0.29023, contrastive loss: 0.01622, Loss positive: 0.01425, Loss negative: 0.00198
2018-10-22 10:53:08.567453: Epoch [ 12/1000] [ 20/183], total loss: 0.00169, regularization loss: 0.29023, contrastive loss: 0.00169, Loss positive: 0.00000, Loss negative: 0.00169
2018-10-22 10:53:18.658184: Epoch [ 12/1000] [ 40/183], total loss: 0.02193, regularization loss: 0.29023, contrastive loss: 0.02193, Loss positive: 0.02150, Loss negative: 0.00043
2018-10-22 10:53:28.802129: Epoch [ 12/1000] [ 60/183], total loss: 0.04250, regularization loss: 0.29023, contrastive loss: 0.04250, Loss positive: 0.03626, Loss negative: 0.00624
2018-10-22 10:53:38.955764: Epoch [ 12/1000] [ 80/183], total loss: 0.02744, regularization loss: 0.29023, contrastive loss: 0.02744, Loss positive: 0.02657, Loss negative: 0.00087
2018-10-22 10:53:49.216428: Epoch [ 12/1000] [100/183], total loss: 0.00173, regularization loss: 0.29023, contrastive loss: 0.00173, Loss positive: 0.00000, Loss negative: 0.00173
2018-10-22 10:53:59.405085: Epoch [ 12/1000] [120/183], total loss: 0.00010, regularization loss: 0.29023, contrastive loss: 0.00010, Loss positive: 0.00000, Loss negative: 0.00010
2018-10-22 10:54:09.664280: Epoch [ 12/1000] [140/183], total loss: 0.00177, regularization loss: 0.29023, contrastive loss: 0.00177, Loss positive: 0.00000, Loss negative: 0.00177
2018-10-22 10:54:19.892945: Epoch [ 12/1000] [160/183], total loss: 0.00044, regularization loss: 0.29023, contrastive loss: 0.00044, Loss positive: 0.00000, Loss negative: 0.00044
2018-10-22 10:54:30.133833: Epoch [ 12/1000] [180/183], total loss: 0.00078, regularization loss: 0.29023, contrastive loss: 0.00078, Loss positive: 0.00000, Loss negative: 0.00078
2018-10-22 10:54:50.716770: Epoch [ 13/1000] [ 20/183], total loss: 0.01035, regularization loss: 0.29023, contrastive loss: 0.01035, Loss positive: 0.00000, Loss negative: 0.01035
2018-10-22 10:55:00.854451: Epoch [ 13/1000] [ 40/183], total loss: 0.00128, regularization loss: 0.29023, contrastive loss: 0.00128, Loss positive: 0.00000, Loss negative: 0.00128
2018-10-22 10:55:11.027453: Epoch [ 13/1000] [ 60/183], total loss: 0.00053, regularization loss: 0.29023, contrastive loss: 0.00053, Loss positive: 0.00000, Loss negative: 0.00053
2018-10-22 10:55:21.220353: Epoch [ 13/1000] [ 80/183], total loss: 0.00104, regularization loss: 0.29023, contrastive loss: 0.00104, Loss positive: 0.00000, Loss negative: 0.00104
2018-10-22 10:55:31.482277: Epoch [ 13/1000] [100/183], total loss: 0.01952, regularization loss: 0.29023, contrastive loss: 0.01952, Loss positive: 0.01690, Loss negative: 0.00262
2018-10-22 10:55:41.704874: Epoch [ 13/1000] [120/183], total loss: 0.00148, regularization loss: 0.29023, contrastive loss: 0.00148, Loss positive: 0.00000, Loss negative: 0.00148
2018-10-22 10:55:51.938478: Epoch [ 13/1000] [140/183], total loss: 0.00005, regularization loss: 0.29023, contrastive loss: 0.00005, Loss positive: 0.00000, Loss negative: 0.00005
2018-10-22 10:56:02.392943: Epoch [ 13/1000] [160/183], total loss: 0.00133, regularization loss: 0.29023, contrastive loss: 0.00133, Loss positive: 0.00000, Loss negative: 0.00133
2018-10-22 10:56:12.640571: Epoch [ 13/1000] [180/183], total loss: 0.00252, regularization loss: 0.29022, contrastive loss: 0.00252, Loss positive: 0.00000, Loss negative: 0.00252
2018-10-22 10:56:33.211160: Epoch [ 14/1000] [ 20/183], total loss: 0.00032, regularization loss: 0.29023, contrastive loss: 0.00032, Loss positive: 0.00000, Loss negative: 0.00032
2018-10-22 10:56:43.346865: Epoch [ 14/1000] [ 40/183], total loss: 0.00056, regularization loss: 0.29023, contrastive loss: 0.00056, Loss positive: 0.00000, Loss negative: 0.00056
2018-10-22 10:56:53.480900: Epoch [ 14/1000] [ 60/183], total loss: 0.00203, regularization loss: 0.29023, contrastive loss: 0.00203, Loss positive: 0.00000, Loss negative: 0.00203
2018-10-22 10:57:03.804219: Epoch [ 14/1000] [ 80/183], total loss: 0.06635, regularization loss: 0.29023, contrastive loss: 0.06635, Loss positive: 0.06521, Loss negative: 0.00115
2018-10-22 10:57:13.968437: Epoch [ 14/1000] [100/183], total loss: 0.00344, regularization loss: 0.29023, contrastive loss: 0.00344, Loss positive: 0.00000, Loss negative: 0.00344
2018-10-22 10:57:24.317403: Epoch [ 14/1000] [120/183], total loss: 0.00000, regularization loss: 0.29023, contrastive loss: 0.00000, Loss positive: 0.00000, Loss negative: 0.00000
2018-10-22 10:57:34.823085: Epoch [ 14/1000] [140/183], total loss: 0.00010, regularization loss: 0.29022, contrastive loss: 0.00010, Loss positive: 0.00000, Loss negative: 0.00010
2018-10-22 10:57:45.032314: Epoch [ 14/1000] [160/183], total loss: 0.00370, regularization loss: 0.29022, contrastive loss: 0.00370, Loss positive: 0.00000, Loss negative: 0.00370
2018-10-22 10:57:55.276186: Epoch [ 14/1000] [180/183], total loss: 0.01846, regularization loss: 0.29022, contrastive loss: 0.01846, Loss positive: 0.01805, Loss negative: 0.00040
2018-10-22 10:58:15.824641: Epoch [ 15/1000] [ 20/183], total loss: 0.00747, regularization loss: 0.29022, contrastive loss: 0.00747, Loss positive: 0.00000, Loss negative: 0.00747
2018-10-22 10:58:25.929562: Epoch [ 15/1000] [ 40/183], total loss: 0.00000, regularization loss: 0.29022, contrastive loss: 0.00000, Loss positive: 0.00000, Loss negative: 0.00000
2018-10-22 10:58:36.031100: Epoch [ 15/1000] [ 60/183], total loss: 0.05970, regularization loss: 0.29022, contrastive loss: 0.05970, Loss positive: 0.05246, Loss negative: 0.00725
2018-10-22 10:58:46.266126: Epoch [ 15/1000] [ 80/183], total loss: 0.00120, regularization loss: 0.29022, contrastive loss: 0.00120, Loss positive: 0.00000, Loss negative: 0.00120
2018-10-22 10:58:56.512150: Epoch [ 15/1000] [100/183], total loss: 0.00071, regularization loss: 0.29022, contrastive loss: 0.00071, Loss positive: 0.00000, Loss negative: 0.00071
2018-10-22 10:59:06.753384: Epoch [ 15/1000] [120/183], total loss: 0.00035, regularization loss: 0.29022, contrastive loss: 0.00035, Loss positive: 0.00000, Loss negative: 0.00035
2018-10-22 10:59:16.946953: Epoch [ 15/1000] [140/183], total loss: 0.00090, regularization loss: 0.29022, contrastive loss: 0.00090, Loss positive: 0.00000, Loss negative: 0.00090
2018-10-22 10:59:27.198344: Epoch [ 15/1000] [160/183], total loss: 0.00398, regularization loss: 0.29022, contrastive loss: 0.00398, Loss positive: 0.00000, Loss negative: 0.00398
2018-10-22 10:59:37.421822: Epoch [ 15/1000] [180/183], total loss: 0.00067, regularization loss: 0.29022, contrastive loss: 0.00067, Loss positive: 0.00000, Loss negative: 0.00067
2018-10-22 10:59:58.099590: Epoch [ 16/1000] [ 20/183], total loss: 0.00000, regularization loss: 0.29023, contrastive loss: 0.00000, Loss positive: 0.00000, Loss negative: 0.00000
2018-10-22 11:00:08.215445: Epoch [ 16/1000] [ 40/183], total loss: 0.00090, regularization loss: 0.29023, contrastive loss: 0.00090, Loss positive: 0.00000, Loss negative: 0.00090
2018-10-22 11:00:18.352210: Epoch [ 16/1000] [ 60/183], total loss: 0.00000, regularization loss: 0.29023, contrastive loss: 0.00000, Loss positive: 0.00000, Loss negative: 0.00000
2018-10-22 11:00:28.528861: Epoch [ 16/1000] [ 80/183], total loss: 0.04929, regularization loss: 0.29023, contrastive loss: 0.04929, Loss positive: 0.04197, Loss negative: 0.00732
2018-10-22 11:00:38.784246: Epoch [ 16/1000] [100/183], total loss: 0.00024, regularization loss: 0.29022, contrastive loss: 0.00024, Loss positive: 0.00000, Loss negative: 0.00024
2018-10-22 11:00:49.019196: Epoch [ 16/1000] [120/183], total loss: 0.00173, regularization loss: 0.29022, contrastive loss: 0.00173, Loss positive: 0.00000, Loss negative: 0.00173
2018-10-22 11:00:59.267628: Epoch [ 16/1000] [140/183], total loss: 0.00395, regularization loss: 0.29022, contrastive loss: 0.00395, Loss positive: 0.00000, Loss negative: 0.00395
2018-10-22 11:01:09.512149: Epoch [ 16/1000] [160/183], total loss: 0.03043, regularization loss: 0.29022, contrastive loss: 0.03043, Loss positive: 0.02620, Loss negative: 0.00423
2018-10-22 11:01:19.971932: Epoch [ 16/1000] [180/183], total loss: 0.00176, regularization loss: 0.29022, contrastive loss: 0.00176, Loss positive: 0.00000, Loss negative: 0.00176
2018-10-22 11:01:41.459090: Epoch [ 17/1000] [ 20/183], total loss: 0.00000, regularization loss: 0.29022, contrastive loss: 0.00000, Loss positive: 0.00000, Loss negative: 0.00000
2018-10-22 11:01:51.600207: Epoch [ 17/1000] [ 40/183], total loss: 0.06622, regularization loss: 0.29022, contrastive loss: 0.06622, Loss positive: 0.06622, Loss negative: 0.00000
2018-10-22 11:02:01.708803: Epoch [ 17/1000] [ 60/183], total loss: 0.00003, regularization loss: 0.29022, contrastive loss: 0.00003, Loss positive: 0.00000, Loss negative: 0.00003
2018-10-22 11:02:11.823990: Epoch [ 17/1000] [ 80/183], total loss: 0.00498, regularization loss: 0.29022, contrastive loss: 0.00498, Loss positive: 0.00000, Loss negative: 0.00498
2018-10-22 11:02:22.172998: Epoch [ 17/1000] [100/183], total loss: 0.00018, regularization loss: 0.29023, contrastive loss: 0.00018, Loss positive: 0.00000, Loss negative: 0.00018
2018-10-22 11:02:32.442867: Epoch [ 17/1000] [120/183], total loss: 0.00164, regularization loss: 0.29022, contrastive loss: 0.00164, Loss positive: 0.00000, Loss negative: 0.00164
2018-10-22 11:02:42.687966: Epoch [ 17/1000] [140/183], total loss: 0.06488, regularization loss: 0.29022, contrastive loss: 0.06488, Loss positive: 0.06399, Loss negative: 0.00090
2018-10-22 11:02:52.931953: Epoch [ 17/1000] [160/183], total loss: 0.04509, regularization loss: 0.29022, contrastive loss: 0.04509, Loss positive: 0.04351, Loss negative: 0.00159
2018-10-22 11:03:03.163641: Epoch [ 17/1000] [180/183], total loss: 0.00002, regularization loss: 0.29022, contrastive loss: 0.00002, Loss positive: 0.00000, Loss negative: 0.00002
2018-10-22 11:03:24.666283: Epoch [ 18/1000] [ 20/183], total loss: 0.04610, regularization loss: 0.29022, contrastive loss: 0.04610, Loss positive: 0.04286, Loss negative: 0.00324
2018-10-22 11:03:34.786687: Epoch [ 18/1000] [ 40/183], total loss: 0.00732, regularization loss: 0.29022, contrastive loss: 0.00732, Loss positive: 0.00000, Loss negative: 0.00732
2018-10-22 11:03:44.891634: Epoch [ 18/1000] [ 60/183], total loss: 0.00060, regularization loss: 0.29022, contrastive loss: 0.00060, Loss positive: 0.00000, Loss negative: 0.00060
2018-10-22 11:03:55.276970: Epoch [ 18/1000] [ 80/183], total loss: 0.00033, regularization loss: 0.29022, contrastive loss: 0.00033, Loss positive: 0.00000, Loss negative: 0.00033
2018-10-22 11:04:05.670090: Epoch [ 18/1000] [100/183], total loss: 0.00001, regularization loss: 0.29022, contrastive loss: 0.00001, Loss positive: 0.00000, Loss negative: 0.00001
2018-10-22 11:04:15.873717: Epoch [ 18/1000] [120/183], total loss: 0.00000, regularization loss: 0.29022, contrastive loss: 0.00000, Loss positive: 0.00000, Loss negative: 0.00000
2018-10-22 11:04:26.231728: Epoch [ 18/1000] [140/183], total loss: 0.01623, regularization loss: 0.29022, contrastive loss: 0.01623, Loss positive: 0.01345, Loss negative: 0.00278
2018-10-22 11:04:36.472347: Epoch [ 18/1000] [160/183], total loss: 0.00771, regularization loss: 0.29022, contrastive loss: 0.00771, Loss positive: 0.00000, Loss negative: 0.00771
2018-10-22 11:04:46.805770: Epoch [ 18/1000] [180/183], total loss: 0.05539, regularization loss: 0.29022, contrastive loss: 0.05539, Loss positive: 0.05314, Loss negative: 0.00225
2018-10-22 11:05:07.419062: Epoch [ 19/1000] [ 20/183], total loss: 0.00020, regularization loss: 0.29022, contrastive loss: 0.00020, Loss positive: 0.00000, Loss negative: 0.00020
2018-10-22 11:05:17.538362: Epoch [ 19/1000] [ 40/183], total loss: 0.00259, regularization loss: 0.29022, contrastive loss: 0.00259, Loss positive: 0.00000, Loss negative: 0.00259
2018-10-22 11:05:27.657961: Epoch [ 19/1000] [ 60/183], total loss: 0.00108, regularization loss: 0.29022, contrastive loss: 0.00108, Loss positive: 0.00000, Loss negative: 0.00108
2018-10-22 11:05:37.806636: Epoch [ 19/1000] [ 80/183], total loss: 0.00057, regularization loss: 0.29022, contrastive loss: 0.00057, Loss positive: 0.00000, Loss negative: 0.00057
2018-10-22 11:05:48.052824: Epoch [ 19/1000] [100/183], total loss: 0.00017, regularization loss: 0.29022, contrastive loss: 0.00017, Loss positive: 0.00000, Loss negative: 0.00017
2018-10-22 11:05:58.291064: Epoch [ 19/1000] [120/183], total loss: 0.00060, regularization loss: 0.29022, contrastive loss: 0.00060, Loss positive: 0.00000, Loss negative: 0.00060
2018-10-22 11:06:08.838380: Epoch [ 19/1000] [140/183], total loss: 0.00892, regularization loss: 0.29022, contrastive loss: 0.00892, Loss positive: 0.00000, Loss negative: 0.00892
2018-10-22 11:06:19.172672: Epoch [ 19/1000] [160/183], total loss: 0.00270, regularization loss: 0.29022, contrastive loss: 0.00270, Loss positive: 0.00000, Loss negative: 0.00270
2018-10-22 11:06:29.437031: Epoch [ 19/1000] [180/183], total loss: 0.00256, regularization loss: 0.29022, contrastive loss: 0.00256, Loss positive: 0.00000, Loss negative: 0.00256
2018-10-22 11:06:50.050126: Epoch [ 20/1000] [ 20/183], total loss: 0.00424, regularization loss: 0.29022, contrastive loss: 0.00424, Loss positive: 0.00000, Loss negative: 0.00424
2018-10-22 11:07:00.163973: Epoch [ 20/1000] [ 40/183], total loss: 0.01279, regularization loss: 0.29022, contrastive loss: 0.01279, Loss positive: 0.01279, Loss negative: 0.00000
2018-10-22 11:07:10.299966: Epoch [ 20/1000] [ 60/183], total loss: 0.00115, regularization loss: 0.29022, contrastive loss: 0.00115, Loss positive: 0.00000, Loss negative: 0.00115
2018-10-22 11:07:20.417317: Epoch [ 20/1000] [ 80/183], total loss: 0.00034, regularization loss: 0.29022, contrastive loss: 0.00034, Loss positive: 0.00000, Loss negative: 0.00034
2018-10-22 11:07:30.667801: Epoch [ 20/1000] [100/183], total loss: 0.00241, regularization loss: 0.29022, contrastive loss: 0.00241, Loss positive: 0.00000, Loss negative: 0.00241
2018-10-22 11:07:40.975698: Epoch [ 20/1000] [120/183], total loss: 0.00011, regularization loss: 0.29022, contrastive loss: 0.00011, Loss positive: 0.00000, Loss negative: 0.00011
2018-10-22 11:07:51.222139: Epoch [ 20/1000] [140/183], total loss: 0.02469, regularization loss: 0.29022, contrastive loss: 0.02469, Loss positive: 0.02351, Loss negative: 0.00117
2018-10-22 11:08:01.492715: Epoch [ 20/1000] [160/183], total loss: 0.00210, regularization loss: 0.29022, contrastive loss: 0.00210, Loss positive: 0.00000, Loss negative: 0.00210
2018-10-22 11:08:11.734115: Epoch [ 20/1000] [180/183], total loss: 0.00007, regularization loss: 0.29022, contrastive loss: 0.00007, Loss positive: 0.00000, Loss negative: 0.00007
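Every training line above reports total loss equal to the contrastive loss, and, up to rounding, contrastive loss = Loss positive + Loss negative; the regularization loss (~0.29022, essentially constant) is printed alongside but evidently not folded into the total. The exact loss formulation is not visible in the log, so the following is only a minimal numpy sketch of a margin-based contrastive loss with that positive/negative split, assuming Euclidean pair distances and a hypothetical margin m:

import numpy as np

def contrastive_loss(d, same_class, m=1.0):
    # d          -- Euclidean distances of embedding pairs, shape (batch,)
    # same_class -- 1.0 for positive (same-class) pairs, 0.0 for negatives
    # m          -- margin; hypothetical value, not shown in the log
    loss_pos = np.mean(same_class * d ** 2)                                # "Loss positive"
    loss_neg = np.mean((1.0 - same_class) * np.maximum(0.0, m - d) ** 2)   # "Loss negative"
    return loss_pos + loss_neg, loss_pos, loss_neg                         # sum = "contrastive loss"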
Recall@1: 0.14365
Recall@2: 0.22654
Recall@4: 0.33474
Recall@8: 0.47519
Recall@16: 0.62441
Recall@32: 0.77481
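Recall@K above is the standard retrieval statistic: the fraction of test queries whose K nearest neighbours (query excluded) contain at least one sample of the same class. A minimal sketch under that assumption, using L2 distances over the test embeddings:

import numpy as np

def recall_at_k(emb, labels, ks=(1, 2, 4, 8, 16, 32)):
    # emb: (n, d) test embeddings, labels: (n,) class ids
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)               # never retrieve the query itself
    order = np.argsort(d, axis=1)             # neighbours, closest first
    hit = labels[order] == labels[:, None]    # same-class indicator per rank
    return {k: float(hit[:, :k].any(axis=1).mean()) for k in ks}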
Inside function n_clusters = 100
sampler_num_per_class = [50. 60. 60. 60. 49. 60. 59. 60. 60. 60. 60. 60. 50. 60. 59. 60. 59. 60.
59. 60. 60. 60. 60. 59. 59. 59. 60. 60. 60. 60. 60. 60. 60. 60. 59. 60.
60. 60. 60. 60. 58. 60. 60. 60. 60. 60. 60. 60. 59. 60. 51. 60. 59. 60.
60. 60. 59. 60. 60. 59. 60. 60. 60. 60. 60. 59. 60. 59. 59. 60. 60. 60.
60. 60. 60. 60. 60. 56. 59. 60. 59. 60. 60. 60. 60. 60. 50. 60. 60. 58.
60. 60. 60. 60. 60. 59. 60. 60. 60. 60.]
sampler_num_per_cluster = [32. 64. 50. 61. 79. 70. 62. 37. 71. 77. 62. 47. 43. 68. 57. 52. 95. 49.
71. 44. 58. 87. 88. 50. 49. 66. 55. 49. 85. 54. 42. 65. 54. 59. 73. 60.
57. 55. 45. 52. 62. 61. 58. 41. 80. 51. 39. 49. 89. 68. 48. 76. 70. 64.
54. 35. 56. 49. 62. 59. 60. 30. 49. 79. 47. 68. 41. 50. 30. 61. 51. 64.
79. 78. 42. 46. 34. 60. 38. 55. 83. 59. 36. 85. 33. 53. 76. 42. 68. 78.
58. 65. 64. 56. 96. 59. 74. 83. 81. 48.]
Purity is 0.160
count_cross = [[0. 0. 1. ... 0. 0. 0.]
[0. 0. 0. ... 0. 1. 0.]
[0. 0. 0. ... 0. 0. 0.]
...
[0. 0. 0. ... 0. 1. 1.]
[0. 2. 1. ... 0. 1. 1.]
[0. 0. 0. ... 0. 0. 0.]]
Mutual information is 1.68259
5924.0
5924
Entropy cluster is 4.57186
Entropy class is 4.60444
normalized_mutual_information is 0.36672
tp_and_fp = 184144.0
tp = 11385.0
fp is 172759.0
fn is 161365.0
RI is 0.9809550040281748
Precision is 0.06182661395429664
Recall is 0.06590448625180897
F_1 is 0.0638004561578508
normalized_mutual_information = 0.3667246593619587
RI = 0.9809550040281748
F_1 = 0.0638004561578508
The NN is 0.14365
The FT is 0.08039
The ST is 0.13231
The DCG is 0.45891
The E is 0.06563
The MAP is 0.06025
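The clustering summary above is internally consistent: with n = 5924 evaluated samples (the bare 5924.0/5924 lines) there are n(n-1)/2 = 17,543,926 sample pairs, and the logged Precision, Recall, F_1, RI and normalized mutual information all follow from the printed counts. A short check:

import math

n = 5924
pairs = n * (n - 1) // 2                              # 17543926 sample pairs
tp, fp, fn = 11385.0, 172759.0, 161365.0              # values logged above
tn = pairs - tp - fp - fn

precision = tp / (tp + fp)                            # 0.06182661395429664
recall = tp / (tp + fn)                               # 0.06590448625180897
f1 = 2 * precision * recall / (precision + recall)    # 0.0638004561578508
ri = (tp + tn) / pairs                                # 0.9809550040281748

mi, h_cluster, h_class = 1.68259, 4.57186, 4.60444    # values logged above
nmi = mi / math.sqrt(h_cluster * h_class)             # 0.36672...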
2018-10-22 11:09:37.323791: Epoch [ 21/1000] [ 20/183], total loss: 0.07410, regularization loss: 0.29022, contrastive loss: 0.07410, Loss positive: 0.07409, Loss negative: 0.00001
2018-10-22 11:09:47.358720: Epoch [ 21/1000] [ 40/183], total loss: 0.05926, regularization loss: 0.29022, contrastive loss: 0.05926, Loss positive: 0.05741, Loss negative: 0.00185
2018-10-22 11:09:57.441417: Epoch [ 21/1000] [ 60/183], total loss: 0.00098, regularization loss: 0.29022, contrastive loss: 0.00098, Loss positive: 0.00000, Loss negative: 0.00098
2018-10-22 11:10:07.545732: Epoch [ 21/1000] [ 80/183], total loss: 0.00520, regularization loss: 0.29022, contrastive loss: 0.00520, Loss positive: 0.00000, Loss negative: 0.00520
2018-10-22 11:10:17.632980: Epoch [ 21/1000] [100/183], total loss: 0.00223, regularization loss: 0.29022, contrastive loss: 0.00223, Loss positive: 0.00000, Loss negative: 0.00223
2018-10-22 11:10:27.735300: Epoch [ 21/1000] [120/183], total loss: 0.00245, regularization loss: 0.29022, contrastive loss: 0.00245, Loss positive: 0.00000, Loss negative: 0.00245
2018-10-22 11:10:37.858802: Epoch [ 21/1000] [140/183], total loss: 0.00205, regularization loss: 0.29022, contrastive loss: 0.00205, Loss positive: 0.00000, Loss negative: 0.00205
2018-10-22 11:10:48.153297: Epoch [ 21/1000] [160/183], total loss: 0.00085, regularization loss: 0.29022, contrastive loss: 0.00085, Loss positive: 0.00000, Loss negative: 0.00085
2018-10-22 11:10:58.332155: Epoch [ 21/1000] [180/183], total loss: 0.00588, regularization loss: 0.29022, contrastive loss: 0.00588, Loss positive: 0.00000, Loss negative: 0.00588
2018-10-22 11:11:19.688348: Epoch [ 22/1000] [ 20/183], total loss: 0.00186, regularization loss: 0.29022, contrastive loss: 0.00186, Loss positive: 0.00000, Loss negative: 0.00186
2018-10-22 11:11:29.819772: Epoch [ 22/1000] [ 40/183], total loss: 0.00137, regularization loss: 0.29022, contrastive loss: 0.00137, Loss positive: 0.00000, Loss negative: 0.00137
2018-10-22 11:11:39.930385: Epoch [ 22/1000] [ 60/183], total loss: 0.01576, regularization loss: 0.29022, contrastive loss: 0.01576, Loss positive: 0.01568, Loss negative: 0.00007
2018-10-22 11:11:50.059969: Epoch [ 22/1000] [ 80/183], total loss: 0.09990, regularization loss: 0.29022, contrastive loss: 0.09990, Loss positive: 0.09972, Loss negative: 0.00018
2018-10-22 11:12:00.263736: Epoch [ 22/1000] [100/183], total loss: 0.02219, regularization loss: 0.29022, contrastive loss: 0.02219, Loss positive: 0.02219, Loss negative: 0.00000
2018-10-22 11:12:10.554990: Epoch [ 22/1000] [120/183], total loss: 0.00001, regularization loss: 0.29022, contrastive loss: 0.00001, Loss positive: 0.00000, Loss negative: 0.00001
2018-10-22 11:12:20.741903: Epoch [ 22/1000] [140/183], total loss: 0.00031, regularization loss: 0.29022, contrastive loss: 0.00031, Loss positive: 0.00000, Loss negative: 0.00031
2018-10-22 11:12:31.038900: Epoch [ 22/1000] [160/183], total loss: 0.00001, regularization loss: 0.29022, contrastive loss: 0.00001, Loss positive: 0.00000, Loss negative: 0.00001
2018-10-22 11:12:41.241303: Epoch [ 22/1000] [180/183], total loss: 0.00033, regularization loss: 0.29022, contrastive loss: 0.00033, Loss positive: 0.00000, Loss negative: 0.00033
2018-10-22 11:13:03.550287: Epoch [ 23/1000] [ 20/183], total loss: 0.00071, regularization loss: 0.29022, contrastive loss: 0.00071, Loss positive: 0.00000, Loss negative: 0.00071
2018-10-22 11:13:13.693347: Epoch [ 23/1000] [ 40/183], total loss: 0.00827, regularization loss: 0.29022, contrastive loss: 0.00827, Loss positive: 0.00000, Loss negative: 0.00827
2018-10-22 11:13:23.860246: Epoch [ 23/1000] [ 60/183], total loss: 0.00076, regularization loss: 0.29022, contrastive loss: 0.00076, Loss positive: 0.00000, Loss negative: 0.00076
2018-10-22 11:13:34.060706: Epoch [ 23/1000] [ 80/183], total loss: 0.00007, regularization loss: 0.29022, contrastive loss: 0.00007, Loss positive: 0.00000, Loss negative: 0.00007
2018-10-22 11:13:44.410316: Epoch [ 23/1000] [100/183], total loss: 0.00004, regularization loss: 0.29022, contrastive loss: 0.00004, Loss positive: 0.00000, Loss negative: 0.00004
2018-10-22 11:13:54.685965: Epoch [ 23/1000] [120/183], total loss: 0.03509, regularization loss: 0.29022, contrastive loss: 0.03509, Loss positive: 0.03427, Loss negative: 0.00082
2018-10-22 11:14:04.934847: Epoch [ 23/1000] [140/183], total loss: 0.00267, regularization loss: 0.29022, contrastive loss: 0.00267, Loss positive: 0.00000, Loss negative: 0.00267
2018-10-22 11:14:15.162170: Epoch [ 23/1000] [160/183], total loss: 0.00528, regularization loss: 0.29022, contrastive loss: 0.00528, Loss positive: 0.00000, Loss negative: 0.00528
2018-10-22 11:14:25.422213: Epoch [ 23/1000] [180/183], total loss: 0.00000, regularization loss: 0.29022, contrastive loss: 0.00000, Loss positive: 0.00000, Loss negative: 0.00000
2018-10-22 11:14:47.212350: Epoch [ 24/1000] [ 20/183], total loss: 0.00190, regularization loss: 0.29022, contrastive loss: 0.00190, Loss positive: 0.00000, Loss negative: 0.00190
2018-10-22 11:14:57.361575: Epoch [ 24/1000] [ 40/183], total loss: 0.00038, regularization loss: 0.29022, contrastive loss: 0.00038, Loss positive: 0.00000, Loss negative: 0.00038
2018-10-22 11:15:07.510454: Epoch [ 24/1000] [ 60/183], total loss: 0.00633, regularization loss: 0.29022, contrastive loss: 0.00633, Loss positive: 0.00000, Loss negative: 0.00633
2018-10-22 11:15:17.723175: Epoch [ 24/1000] [ 80/183], total loss: 0.00288, regularization loss: 0.29021, contrastive loss: 0.00288, Loss positive: 0.00000, Loss negative: 0.00288
2018-10-22 11:15:27.966545: Epoch [ 24/1000] [100/183], total loss: 0.00469, regularization loss: 0.29022, contrastive loss: 0.00469, Loss positive: 0.00000, Loss negative: 0.00469
2018-10-22 11:15:38.179477: Epoch [ 24/1000] [120/183], total loss: 0.03974, regularization loss: 0.29022, contrastive loss: 0.03974, Loss positive: 0.03908, Loss negative: 0.00066
2018-10-22 11:15:48.543602: Epoch [ 24/1000] [140/183], total loss: 0.00243, regularization loss: 0.29022, contrastive loss: 0.00243, Loss positive: 0.00000, Loss negative: 0.00243
2018-10-22 11:15:58.789508: Epoch [ 24/1000] [160/183], total loss: 0.00016, regularization loss: 0.29022, contrastive loss: 0.00016, Loss positive: 0.00000, Loss negative: 0.00016
2018-10-22 11:16:09.020123: Epoch [ 24/1000] [180/183], total loss: 0.00001, regularization loss: 0.29022, contrastive loss: 0.00001, Loss positive: 0.00000, Loss negative: 0.00001
2018-10-22 11:16:31.187254: Epoch [ 25/1000] [ 20/183], total loss: 0.03327, regularization loss: 0.29022, contrastive loss: 0.03327, Loss positive: 0.03095, Loss negative: 0.00232
2018-10-22 11:16:41.330640: Epoch [ 25/1000] [ 40/183], total loss: 0.01951, regularization loss: 0.29021, contrastive loss: 0.01951, Loss positive: 0.01704, Loss negative: 0.00247
2018-10-22 11:16:51.465480: Epoch [ 25/1000] [ 60/183], total loss: 0.00212, regularization loss: 0.29022, contrastive loss: 0.00212, Loss positive: 0.00000, Loss negative: 0.00212
2018-10-22 11:17:01.611105: Epoch [ 25/1000] [ 80/183], total loss: 0.00088, regularization loss: 0.29021, contrastive loss: 0.00088, Loss positive: 0.00000, Loss negative: 0.00088
2018-10-22 11:17:11.821330: Epoch [ 25/1000] [100/183], total loss: 0.00000, regularization loss: 0.29022, contrastive loss: 0.00000, Loss positive: 0.00000, Loss negative: 0.00000
2018-10-22 11:17:22.046142: Epoch [ 25/1000] [120/183], total loss: 0.00106, regularization loss: 0.29022, contrastive loss: 0.00106, Loss positive: 0.00000, Loss negative: 0.00106
2018-10-22 11:17:32.302149: Epoch [ 25/1000] [140/183], total loss: 0.00181, regularization loss: 0.29021, contrastive loss: 0.00181, Loss positive: 0.00000, Loss negative: 0.00181
2018-10-22 11:17:42.499939: Epoch [ 25/1000] [160/183], total loss: 0.04958, regularization loss: 0.29021, contrastive loss: 0.04958, Loss positive: 0.04837, Loss negative: 0.00121
2018-10-22 11:17:52.786414: Epoch [ 25/1000] [180/183], total loss: 0.00020, regularization loss: 0.29021, contrastive loss: 0.00020, Loss positive: 0.00000, Loss negative: 0.00020
2018-10-22 11:18:15.526632: Epoch [ 26/1000] [ 20/183], total loss: 0.00244, regularization loss: 0.29021, contrastive loss: 0.00244, Loss positive: 0.00000, Loss negative: 0.00244
2018-10-22 11:18:25.663083: Epoch [ 26/1000] [ 40/183], total loss: 0.00086, regularization loss: 0.29021, contrastive loss: 0.00086, Loss positive: 0.00000, Loss negative: 0.00086
2018-10-22 11:18:35.808159: Epoch [ 26/1000] [ 60/183], total loss: 0.03434, regularization loss: 0.29021, contrastive loss: 0.03434, Loss positive: 0.02807, Loss negative: 0.00627
2018-10-22 11:18:46.004802: Epoch [ 26/1000] [ 80/183], total loss: 0.04205, regularization loss: 0.29022, contrastive loss: 0.04205, Loss positive: 0.04045, Loss negative: 0.00160
2018-10-22 11:18:56.249630: Epoch [ 26/1000] [100/183], total loss: 0.02141, regularization loss: 0.29021, contrastive loss: 0.02141, Loss positive: 0.01951, Loss negative: 0.00190
2018-10-22 11:19:06.537286: Epoch [ 26/1000] [120/183], total loss: 0.00020, regularization loss: 0.29021, contrastive loss: 0.00020, Loss positive: 0.00000, Loss negative: 0.00020
2018-10-22 11:19:16.841208: Epoch [ 26/1000] [140/183], total loss: 0.03925, regularization loss: 0.29021, contrastive loss: 0.03925, Loss positive: 0.03215, Loss negative: 0.00709
2018-10-22 11:19:27.064183: Epoch [ 26/1000] [160/183], total loss: 0.00224, regularization loss: 0.29021, contrastive loss: 0.00224, Loss positive: 0.00000, Loss negative: 0.00224
2018-10-22 11:19:37.313218: Epoch [ 26/1000] [180/183], total loss: 0.00054, regularization loss: 0.29022, contrastive loss: 0.00054, Loss positive: 0.00000, Loss negative: 0.00054
2018-10-22 11:20:00.164887: Epoch [ 27/1000] [ 20/183], total loss: 0.00140, regularization loss: 0.29021, contrastive loss: 0.00140, Loss positive: 0.00000, Loss negative: 0.00140
2018-10-22 11:20:10.325025: Epoch [ 27/1000] [ 40/183], total loss: 0.04543, regularization loss: 0.29021, contrastive loss: 0.04543, Loss positive: 0.04289, Loss negative: 0.00254
2018-10-22 11:20:20.500789: Epoch [ 27/1000] [ 60/183], total loss: 0.00347, regularization loss: 0.29021, contrastive loss: 0.00347, Loss positive: 0.00000, Loss negative: 0.00347
2018-10-22 11:20:30.668657: Epoch [ 27/1000] [ 80/183], total loss: 0.04194, regularization loss: 0.29021, contrastive loss: 0.04194, Loss positive: 0.03799, Loss negative: 0.00396
2018-10-22 11:20:40.927541: Epoch [ 27/1000] [100/183], total loss: 0.00063, regularization loss: 0.29021, contrastive loss: 0.00063, Loss positive: 0.00000, Loss negative: 0.00063
2018-10-22 11:20:51.136159: Epoch [ 27/1000] [120/183], total loss: 0.00125, regularization loss: 0.29021, contrastive loss: 0.00125, Loss positive: 0.00000, Loss negative: 0.00125
2018-10-22 11:21:01.357889: Epoch [ 27/1000] [140/183], total loss: 0.02839, regularization loss: 0.29021, contrastive loss: 0.02839, Loss positive: 0.02839, Loss negative: 0.00000
2018-10-22 11:21:11.626434: Epoch [ 27/1000] [160/183], total loss: 0.00317, regularization loss: 0.29021, contrastive loss: 0.00317, Loss positive: 0.00000, Loss negative: 0.00317
2018-10-22 11:21:21.845025: Epoch [ 27/1000] [180/183], total loss: 0.04243, regularization loss: 0.29021, contrastive loss: 0.04243, Loss positive: 0.03909, Loss negative: 0.00334
2018-10-22 11:21:44.727705: Epoch [ 28/1000] [ 20/183], total loss: 0.00418, regularization loss: 0.29021, contrastive loss: 0.00418, Loss positive: 0.00000, Loss negative: 0.00418
2018-10-22 11:21:54.872150: Epoch [ 28/1000] [ 40/183], total loss: 0.01758, regularization loss: 0.29021, contrastive loss: 0.01758, Loss positive: 0.01720, Loss negative: 0.00038
2018-10-22 11:22:05.035320: Epoch [ 28/1000] [ 60/183], total loss: 0.00000, regularization loss: 0.29021, contrastive loss: 0.00000, Loss positive: 0.00000, Loss negative: 0.00000
2018-10-22 11:22:15.205105: Epoch [ 28/1000] [ 80/183], total loss: 0.03942, regularization loss: 0.29022, contrastive loss: 0.03942, Loss positive: 0.03855, Loss negative: 0.00087
2018-10-22 11:22:25.456203: Epoch [ 28/1000] [100/183], total loss: 0.00756, regularization loss: 0.29021, contrastive loss: 0.00756, Loss positive: 0.00000, Loss negative: 0.00756
2018-10-22 11:22:35.686314: Epoch [ 28/1000] [120/183], total loss: 0.00229, regularization loss: 0.29021, contrastive loss: 0.00229, Loss positive: 0.00000, Loss negative: 0.00229
2018-10-22 11:22:45.934286: Epoch [ 28/1000] [140/183], total loss: 0.00005, regularization loss: 0.29021, contrastive loss: 0.00005, Loss positive: 0.00000, Loss negative: 0.00005
2018-10-22 11:22:56.490975: Epoch [ 28/1000] [160/183], total loss: 0.00078, regularization loss: 0.29021, contrastive loss: 0.00078, Loss positive: 0.00000, Loss negative: 0.00078
2018-10-22 11:23:06.846476: Epoch [ 28/1000] [180/183], total loss: 0.00241, regularization loss: 0.29022, contrastive loss: 0.00241, Loss positive: 0.00000, Loss negative: 0.00241
2018-10-22 11:23:31.832684: Epoch [ 29/1000] [ 20/183], total loss: 0.05226, regularization loss: 0.29021, contrastive loss: 0.05226, Loss positive: 0.04878, Loss negative: 0.00348
2018-10-22 11:23:42.063528: Epoch [ 29/1000] [ 40/183], total loss: 0.00037, regularization loss: 0.29021, contrastive loss: 0.00037, Loss positive: 0.00000, Loss negative: 0.00037
2018-10-22 11:23:52.223356: Epoch [ 29/1000] [ 60/183], total loss: 0.00002, regularization loss: 0.29021, contrastive loss: 0.00002, Loss positive: 0.00000, Loss negative: 0.00002
2018-10-22 11:24:02.446409: Epoch [ 29/1000] [ 80/183], total loss: 0.05492, regularization loss: 0.29021, contrastive loss: 0.05492, Loss positive: 0.05210, Loss negative: 0.00282
2018-10-22 11:24:12.682251: Epoch [ 29/1000] [100/183], total loss: 0.03990, regularization loss: 0.29021, contrastive loss: 0.03990, Loss positive: 0.03840, Loss negative: 0.00150
2018-10-22 11:24:23.022174: Epoch [ 29/1000] [120/183], total loss: 0.00474, regularization loss: 0.29021, contrastive loss: 0.00474, Loss positive: 0.00000, Loss negative: 0.00474
2018-10-22 11:24:33.256458: Epoch [ 29/1000] [140/183], total loss: 0.00025, regularization loss: 0.29021, contrastive loss: 0.00025, Loss positive: 0.00000, Loss negative: 0.00025
2018-10-22 11:24:43.534629: Epoch [ 29/1000] [160/183], total loss: 0.03121, regularization loss: 0.29021, contrastive loss: 0.03121, Loss positive: 0.02955, Loss negative: 0.00166
2018-10-22 11:24:54.041340: Epoch [ 29/1000] [180/183], total loss: 0.00105, regularization loss: 0.29021, contrastive loss: 0.00105, Loss positive: 0.00000, Loss negative: 0.00105
2018-10-22 11:25:18.087066: Epoch [ 30/1000] [ 20/183], total loss: 0.00295, regularization loss: 0.29021, contrastive loss: 0.00295, Loss positive: 0.00000, Loss negative: 0.00295
2018-10-22 11:25:28.234967: Epoch [ 30/1000] [ 40/183], total loss: 0.00144, regularization loss: 0.29021, contrastive loss: 0.00144, Loss positive: 0.00000, Loss negative: 0.00144
2018-10-22 11:25:38.399114: Epoch [ 30/1000] [ 60/183], total loss: 0.00076, regularization loss: 0.29021, contrastive loss: 0.00076, Loss positive: 0.00000, Loss negative: 0.00076
2018-10-22 11:25:48.664013: Epoch [ 30/1000] [ 80/183], total loss: 0.00054, regularization loss: 0.29021, contrastive loss: 0.00054, Loss positive: 0.00000, Loss negative: 0.00054
2018-10-22 11:25:58.855706: Epoch [ 30/1000] [100/183], total loss: 0.02576, regularization loss: 0.29021, contrastive loss: 0.02576, Loss positive: 0.02560, Loss negative: 0.00016
2018-10-22 11:26:09.104541: Epoch [ 30/1000] [120/183], total loss: 0.00069, regularization loss: 0.29021, contrastive loss: 0.00069, Loss positive: 0.00000, Loss negative: 0.00069
2018-10-22 11:26:19.694327: Epoch [ 30/1000] [140/183], total loss: 0.04767, regularization loss: 0.29021, contrastive loss: 0.04767, Loss positive: 0.04765, Loss negative: 0.00002
2018-10-22 11:26:30.165801: Epoch [ 30/1000] [160/183], total loss: 0.00018, regularization loss: 0.29021, contrastive loss: 0.00018, Loss positive: 0.00000, Loss negative: 0.00018
2018-10-22 11:26:40.553908: Epoch [ 30/1000] [180/183], total loss: 0.00437, regularization loss: 0.29021, contrastive loss: 0.00437, Loss positive: 0.00000, Loss negative: 0.00437
Recall@1: 0.16188
Recall@2: 0.25473
Recall@4: 0.37373
Recall@8: 0.51772
Recall@16: 0.67083
Recall@32: 0.80149
Inside function n_clusters = 100
sampler_num_per_class = [50. 60. 60. 60. 49. 60. 59. 60. 60. 60. 60. 60. 50. 60. 59. 60. 59. 60.
59. 60. 60. 60. 60. 59. 59. 59. 60. 60. 60. 60. 60. 60. 60. 60. 59. 60.
60. 60. 60. 60. 58. 60. 60. 60. 60. 60. 60. 60. 59. 60. 51. 60. 59. 60.
60. 60. 59. 60. 60. 59. 60. 60. 60. 60. 60. 59. 60. 59. 59. 60. 60. 60.
60. 60. 60. 60. 60. 56. 59. 60. 59. 60. 60. 60. 60. 60. 50. 60. 60. 58.
60. 60. 60. 60. 60. 59. 60. 60. 60. 60.]
sampler_num_per_cluster = [ 73. 71. 57. 70. 70. 63. 62. 33. 76. 63. 50. 99. 81. 51.
76. 39. 52. 45. 62. 39. 25. 79. 77. 73. 78. 44. 39. 13.
93. 42. 72. 66. 63. 45. 69. 74. 64. 30. 69. 38. 34. 77.
68. 78. 77. 53. 67. 69. 24. 96. 70. 63. 90. 37. 31. 29.
72. 68. 112. 38. 80. 47. 72. 42. 81. 57. 95. 45. 64. 67.
39. 49. 62. 34. 76. 44. 59. 55. 61. 48. 104. 52. 32. 66.
47. 54. 75. 84. 39. 38. 20. 36. 68. 52. 66. 58. 64. 48.
54. 52.]
Purity is 0.189
count_cross = [[ 0. 4. 3. ... 0. 0. 0.]
[ 0. 0. 0. ... 0. 0. 11.]
[ 0. 0. 0. ... 0. 0. 0.]
...
[ 3. 0. 0. ... 0. 0. 0.]
[ 0. 0. 1. ... 1. 0. 0.]
[ 0. 0. 0. ... 0. 0. 1.]]
Mutual information is 1.80265
5924.0
5924
Entropy cluster is 4.54967
Entropy class is 4.60444
normalized_mutual_information is 0.39385
tp_and_fp = 191163.0
tp = 14246.0
fp is 176917.0
fn is 158504.0
RI is 0.9808810753077731
Precision is 0.07452278945193369
Recall is 0.08246599131693198
F_1 is 0.0782934382668385
normalized_mutual_information = 0.393845332589924
RI = 0.9808810753077731
F_1 = 0.0782934382668385
The NN is 0.16188
The FT is 0.09827
The ST is 0.16021
The DCG is 0.47963
The E is 0.08000
The MAP is 0.07771
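count_cross is the cluster-by-class contingency table (rows are the 100 clusters, columns the 100 ground-truth classes; numpy elides the middle when printing). Purity, the mutual information and both entropies reported above all derive from it, using natural logs (Entropy class 4.60444 is just under ln 100 = 4.60517 for the near-uniform class sizes). A minimal sketch, assuming that table as a numpy array:

import numpy as np

def contingency_metrics(count_cross):
    n = count_cross.sum()
    purity = count_cross.max(axis=1).sum() / n        # best class per cluster
    p = count_cross / n                               # joint distribution
    pc = p.sum(axis=1, keepdims=True)                 # cluster marginal
    pk = p.sum(axis=0, keepdims=True)                 # class marginal
    outer = pc @ pk                                   # product of marginals
    nz = p > 0
    mi = float((p[nz] * np.log(p[nz] / outer[nz])).sum())
    h_cluster = float(-(pc[pc > 0] * np.log(pc[pc > 0])).sum())
    h_class = float(-(pk[pk > 0] * np.log(pk[pk > 0])).sum())
    return purity, mi, h_cluster, h_class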
2018-10-22 11:28:11.047908: Epoch [ 31/1000] [ 20/183], total loss: 0.00161, regularization loss: 0.29021, contrastive loss: 0.00161, Loss positive: 0.00000, Loss negative: 0.00161
2018-10-22 11:28:21.082197: Epoch [ 31/1000] [ 40/183], total loss: 0.00040, regularization loss: 0.29021, contrastive loss: 0.00040, Loss positive: 0.00000, Loss negative: 0.00040
2018-10-22 11:28:31.158022: Epoch [ 31/1000] [ 60/183], total loss: 0.00004, regularization loss: 0.29021, contrastive loss: 0.00004, Loss positive: 0.00000, Loss negative: 0.00004
2018-10-22 11:28:41.236767: Epoch [ 31/1000] [ 80/183], total loss: 0.08701, regularization loss: 0.29021, contrastive loss: 0.08701, Loss positive: 0.08455, Loss negative: 0.00247
2018-10-22 11:28:51.367895: Epoch [ 31/1000] [100/183], total loss: 0.00197, regularization loss: 0.29021, contrastive loss: 0.00197, Loss positive: 0.00000, Loss negative: 0.00197
2018-10-22 11:29:01.504430: Epoch [ 31/1000] [120/183], total loss: 0.00192, regularization loss: 0.29021, contrastive loss: 0.00192, Loss positive: 0.00000, Loss negative: 0.00192
2018-10-22 11:29:11.698919: Epoch [ 31/1000] [140/183], total loss: 0.00167, regularization loss: 0.29021, contrastive loss: 0.00167, Loss positive: 0.00000, Loss negative: 0.00167
2018-10-22 11:29:21.948042: Epoch [ 31/1000] [160/183], total loss: 0.03009, regularization loss: 0.29021, contrastive loss: 0.03009, Loss positive: 0.02880, Loss negative: 0.00129
2018-10-22 11:29:32.156384: Epoch [ 31/1000] [180/183], total loss: 0.14550, regularization loss: 0.29021, contrastive loss: 0.14550, Loss positive: 0.14385, Loss negative: 0.00165
2018-10-22 11:29:53.767834: Epoch [ 32/1000] [ 20/183], total loss: 0.00295, regularization loss: 0.29021, contrastive loss: 0.00295, Loss positive: 0.00000, Loss negative: 0.00295
2018-10-22 11:30:03.872887: Epoch [ 32/1000] [ 40/183], total loss: 0.00596, regularization loss: 0.29021, contrastive loss: 0.00596, Loss positive: 0.00000, Loss negative: 0.00596
2018-10-22 11:30:13.970194: Epoch [ 32/1000] [ 60/183], total loss: 0.01928, regularization loss: 0.29021, contrastive loss: 0.01928, Loss positive: 0.01473, Loss negative: 0.00456
2018-10-22 11:30:24.085430: Epoch [ 32/1000] [ 80/183], total loss: 0.00164, regularization loss: 0.29021, contrastive loss: 0.00164, Loss positive: 0.00000, Loss negative: 0.00164
2018-10-22 11:30:34.301650: Epoch [ 32/1000] [100/183], total loss: 0.00244, regularization loss: 0.29021, contrastive loss: 0.00244, Loss positive: 0.00000, Loss negative: 0.00244
2018-10-22 11:30:44.746721: Epoch [ 32/1000] [120/183], total loss: 0.00160, regularization loss: 0.29021, contrastive loss: 0.00160, Loss positive: 0.00000, Loss negative: 0.00160
2018-10-22 11:30:55.111486: Epoch [ 32/1000] [140/183], total loss: 0.00347, regularization loss: 0.29021, contrastive loss: 0.00347, Loss positive: 0.00000, Loss negative: 0.00347
2018-10-22 11:31:05.352343: Epoch [ 32/1000] [160/183], total loss: 0.00524, regularization loss: 0.29021, contrastive loss: 0.00524, Loss positive: 0.00000, Loss negative: 0.00524
2018-10-22 11:31:15.581534: Epoch [ 32/1000] [180/183], total loss: 0.07169, regularization loss: 0.29021, contrastive loss: 0.07169, Loss positive: 0.07013, Loss negative: 0.00155
2018-10-22 11:31:37.052606: Epoch [ 33/1000] [ 20/183], total loss: 0.00378, regularization loss: 0.29021, contrastive loss: 0.00378, Loss positive: 0.00000, Loss negative: 0.00378
2018-10-22 11:31:47.181214: Epoch [ 33/1000] [ 40/183], total loss: 0.00448, regularization loss: 0.29021, contrastive loss: 0.00448, Loss positive: 0.00000, Loss negative: 0.00448
2018-10-22 11:31:57.310050: Epoch [ 33/1000] [ 60/183], total loss: 0.03699, regularization loss: 0.29021, contrastive loss: 0.03699, Loss positive: 0.03616, Loss negative: 0.00083
2018-10-22 11:32:07.511557: Epoch [ 33/1000] [ 80/183], total loss: 0.03685, regularization loss: 0.29021, contrastive loss: 0.03685, Loss positive: 0.02567, Loss negative: 0.01118
2018-10-22 11:32:17.762190: Epoch [ 33/1000] [100/183], total loss: 0.00058, regularization loss: 0.29021, contrastive loss: 0.00058, Loss positive: 0.00000, Loss negative: 0.00058
2018-10-22 11:32:27.986668: Epoch [ 33/1000] [120/183], total loss: 0.00022, regularization loss: 0.29021, contrastive loss: 0.00022, Loss positive: 0.00000, Loss negative: 0.00022
2018-10-22 11:32:38.330186: Epoch [ 33/1000] [140/183], total loss: 0.10579, regularization loss: 0.29021, contrastive loss: 0.10579, Loss positive: 0.10491, Loss negative: 0.00088
2018-10-22 11:32:48.609470: Epoch [ 33/1000] [160/183], total loss: 0.00004, regularization loss: 0.29021, contrastive loss: 0.00004, Loss positive: 0.00000, Loss negative: 0.00004
2018-10-22 11:32:58.799134: Epoch [ 33/1000] [180/183], total loss: 0.00216, regularization loss: 0.29021, contrastive loss: 0.00216, Loss positive: 0.00000, Loss negative: 0.00216
2018-10-22 11:33:20.259454: Epoch [ 34/1000] [ 20/183], total loss: 0.02216, regularization loss: 0.29021, contrastive loss: 0.02216, Loss positive: 0.02125, Loss negative: 0.00091
2018-10-22 11:33:30.398767: Epoch [ 34/1000] [ 40/183], total loss: 0.02725, regularization loss: 0.29021, contrastive loss: 0.02725, Loss positive: 0.02502, Loss negative: 0.00223
2018-10-22 11:33:40.516233: Epoch [ 34/1000] [ 60/183], total loss: 0.02218, regularization loss: 0.29021, contrastive loss: 0.02218, Loss positive: 0.02212, Loss negative: 0.00006
2018-10-22 11:33:50.717122: Epoch [ 34/1000] [ 80/183], total loss: 0.02106, regularization loss: 0.29021, contrastive loss: 0.02106, Loss positive: 0.02018, Loss negative: 0.00088
2018-10-22 11:34:00.990170: Epoch [ 34/1000] [100/183], total loss: 0.00062, regularization loss: 0.29021, contrastive loss: 0.00062, Loss positive: 0.00000, Loss negative: 0.00062
2018-10-22 11:34:11.231992: Epoch [ 34/1000] [120/183], total loss: 0.00270, regularization loss: 0.29021, contrastive loss: 0.00270, Loss positive: 0.00000, Loss negative: 0.00270
2018-10-22 11:34:21.435026: Epoch [ 34/1000] [140/183], total loss: 0.00485, regularization loss: 0.29021, contrastive loss: 0.00485, Loss positive: 0.00000, Loss negative: 0.00485
2018-10-22 11:34:31.681482: Epoch [ 34/1000] [160/183], total loss: 0.00278, regularization loss: 0.29021, contrastive loss: 0.00278, Loss positive: 0.00000, Loss negative: 0.00278
2018-10-22 11:34:41.928998: Epoch [ 34/1000] [180/183], total loss: 0.00164, regularization loss: 0.29021, contrastive loss: 0.00164, Loss positive: 0.00000, Loss negative: 0.00164
2018-10-22 11:35:02.461954: Epoch [ 35/1000] [ 20/183], total loss: 0.00311, regularization loss: 0.29021, contrastive loss: 0.00311, Loss positive: 0.00000, Loss negative: 0.00311
2018-10-22 11:35:12.579465: Epoch [ 35/1000] [ 40/183], total loss: 0.04578, regularization loss: 0.29021, contrastive loss: 0.04578, Loss positive: 0.04342, Loss negative: 0.00236
2018-10-22 11:35:22.722859: Epoch [ 35/1000] [ 60/183], total loss: 0.04131, regularization loss: 0.29021, contrastive loss: 0.04131, Loss positive: 0.03942, Loss negative: 0.00189
2018-10-22 11:35:32.968145: Epoch [ 35/1000] [ 80/183], total loss: 0.00111, regularization loss: 0.29021, contrastive loss: 0.00111, Loss positive: 0.00000, Loss negative: 0.00111
2018-10-22 11:35:43.155333: Epoch [ 35/1000] [100/183], total loss: 0.00115, regularization loss: 0.29021, contrastive loss: 0.00115, Loss positive: 0.00000, Loss negative: 0.00115
2018-10-22 11:35:53.417655: Epoch [ 35/1000] [120/183], total loss: 0.00761, regularization loss: 0.29021, contrastive loss: 0.00761, Loss positive: 0.00000, Loss negative: 0.00761
2018-10-22 11:36:03.636266: Epoch [ 35/1000] [140/183], total loss: 0.03135, regularization loss: 0.29021, contrastive loss: 0.03135, Loss positive: 0.02637, Loss negative: 0.00498
2018-10-22 11:36:13.894668: Epoch [ 35/1000] [160/183], total loss: 0.00426, regularization loss: 0.29021, contrastive loss: 0.00426, Loss positive: 0.00000, Loss negative: 0.00426
2018-10-22 11:36:24.160369: Epoch [ 35/1000] [180/183], total loss: 0.04302, regularization loss: 0.29021, contrastive loss: 0.04302, Loss positive: 0.04195, Loss negative: 0.00107
2018-10-22 11:36:45.016834: Epoch [ 36/1000] [ 20/183], total loss: 0.00059, regularization loss: 0.29021, contrastive loss: 0.00059, Loss positive: 0.00000, Loss negative: 0.00059
2018-10-22 11:36:55.163393: Epoch [ 36/1000] [ 40/183], total loss: 0.00178, regularization loss: 0.29021, contrastive loss: 0.00178, Loss positive: 0.00000, Loss negative: 0.00178
2018-10-22 11:37:05.316545: Epoch [ 36/1000] [ 60/183], total loss: 0.00229, regularization loss: 0.29021, contrastive loss: 0.00229, Loss positive: 0.00000, Loss negative: 0.00229
2018-10-22 11:37:15.623133: Epoch [ 36/1000] [ 80/183], total loss: 0.02048, regularization loss: 0.29021, contrastive loss: 0.02048, Loss positive: 0.01398, Loss negative: 0.00650
2018-10-22 11:37:25.871079: Epoch [ 36/1000] [100/183], total loss: 0.01720, regularization loss: 0.29021, contrastive loss: 0.01720, Loss positive: 0.01599, Loss negative: 0.00121
2018-10-22 11:37:36.115479: Epoch [ 36/1000] [120/183], total loss: 0.00087, regularization loss: 0.29021, contrastive loss: 0.00087, Loss positive: 0.00000, Loss negative: 0.00087
2018-10-22 11:37:46.403578: Epoch [ 36/1000] [140/183], total loss: 0.00248, regularization loss: 0.29021, contrastive loss: 0.00248, Loss positive: 0.00000, Loss negative: 0.00248
2018-10-22 11:37:56.571907: Epoch [ 36/1000] [160/183], total loss: 0.02848, regularization loss: 0.29021, contrastive loss: 0.02848, Loss positive: 0.02683, Loss negative: 0.00165
2018-10-22 11:38:06.810593: Epoch [ 36/1000] [180/183], total loss: 0.05841, regularization loss: 0.29021, contrastive loss: 0.05841, Loss positive: 0.05823, Loss negative: 0.00018
2018-10-22 11:38:27.354475: Epoch [ 37/1000] [ 20/183], total loss: 0.02204, regularization loss: 0.29021, contrastive loss: 0.02204, Loss positive: 0.01781, Loss negative: 0.00422
2018-10-22 11:38:37.462821: Epoch [ 37/1000] [ 40/183], total loss: 0.00000, regularization loss: 0.29021, contrastive loss: 0.00000, Loss positive: 0.00000, Loss negative: 0.00000
2018-10-22 11:38:47.617006: Epoch [ 37/1000] [ 60/183], total loss: 0.00007, regularization loss: 0.29021, contrastive loss: 0.00007, Loss positive: 0.00000, Loss negative: 0.00007
2018-10-22 11:38:57.807671: Epoch [ 37/1000] [ 80/183], total loss: 0.00094, regularization loss: 0.29021, contrastive loss: 0.00094, Loss positive: 0.00000, Loss negative: 0.00094
2018-10-22 11:39:08.065974: Epoch [ 37/1000] [100/183], total loss: 0.00107, regularization loss: 0.29021, contrastive loss: 0.00107, Loss positive: 0.00000, Loss negative: 0.00107
2018-10-22 11:39:18.340588: Epoch [ 37/1000] [120/183], total loss: 0.07864, regularization loss: 0.29020, contrastive loss: 0.07864, Loss positive: 0.07768, Loss negative: 0.00096
2018-10-22 11:39:28.548747: Epoch [ 37/1000] [140/183], total loss: 0.00243, regularization loss: 0.29020, contrastive loss: 0.00243, Loss positive: 0.00000, Loss negative: 0.00243
2018-10-22 11:39:38.895591: Epoch [ 37/1000] [160/183], total loss: 0.01007, regularization loss: 0.29020, contrastive loss: 0.01007, Loss positive: 0.00000, Loss negative: 0.01007
2018-10-22 11:39:49.150576: Epoch [ 37/1000] [180/183], total loss: 0.00292, regularization loss: 0.29020, contrastive loss: 0.00292, Loss positive: 0.00000, Loss negative: 0.00292
2018-10-22 11:40:09.813223: Epoch [ 38/1000] [ 20/183], total loss: 0.04127, regularization loss: 0.29020, contrastive loss: 0.04127, Loss positive: 0.03654, Loss negative: 0.00473
2018-10-22 11:40:19.966464: Epoch [ 38/1000] [ 40/183], total loss: 0.00508, regularization loss: 0.29020, contrastive loss: 0.00508, Loss positive: 0.00000, Loss negative: 0.00508
2018-10-22 11:40:30.063796: Epoch [ 38/1000] [ 60/183], total loss: 0.00255, regularization loss: 0.29020, contrastive loss: 0.00255, Loss positive: 0.00000, Loss negative: 0.00255
2018-10-22 11:40:40.219298: Epoch [ 38/1000] [ 80/183], total loss: 0.00548, regularization loss: 0.29020, contrastive loss: 0.00548, Loss positive: 0.00000, Loss negative: 0.00548
2018-10-22 11:40:50.574461: Epoch [ 38/1000] [100/183], total loss: 0.00079, regularization loss: 0.29020, contrastive loss: 0.00079, Loss positive: 0.00000, Loss negative: 0.00079
2018-10-22 11:41:00.725894: Epoch [ 38/1000] [120/183], total loss: 0.00000, regularization loss: 0.29020, contrastive loss: 0.00000, Loss positive: 0.00000, Loss negative: 0.00000
2018-10-22 11:41:10.948396: Epoch [ 38/1000] [140/183], total loss: 0.00539, regularization loss: 0.29020, contrastive loss: 0.00539, Loss positive: 0.00000, Loss negative: 0.00539
2018-10-22 11:41:21.208346: Epoch [ 38/1000] [160/183], total loss: 0.02101, regularization loss: 0.29021, contrastive loss: 0.02101, Loss positive: 0.02093, Loss negative: 0.00008
2018-10-22 11:41:31.462673: Epoch [ 38/1000] [180/183], total loss: 0.09158, regularization loss: 0.29021, contrastive loss: 0.09158, Loss positive: 0.09103, Loss negative: 0.00055
2018-10-22 11:41:52.846288: Epoch [ 39/1000] [ 20/183], total loss: 0.00012, regularization loss: 0.29020, contrastive loss: 0.00012, Loss positive: 0.00000, Loss negative: 0.00012
2018-10-22 11:42:02.960028: Epoch [ 39/1000] [ 40/183], total loss: 0.00009, regularization loss: 0.29020, contrastive loss: 0.00009, Loss positive: 0.00000, Loss negative: 0.00009
2018-10-22 11:42:13.093694: Epoch [ 39/1000] [ 60/183], total loss: 0.01590, regularization loss: 0.29020, contrastive loss: 0.01590, Loss positive: 0.01590, Loss negative: 0.00000
2018-10-22 11:42:23.236637: Epoch [ 39/1000] [ 80/183], total loss: 0.00009, regularization loss: 0.29020, contrastive loss: 0.00009, Loss positive: 0.00000, Loss negative: 0.00009
2018-10-22 11:42:33.502552: Epoch [ 39/1000] [100/183], total loss: 0.00207, regularization loss: 0.29020, contrastive loss: 0.00207, Loss positive: 0.00000, Loss negative: 0.00207
2018-10-22 11:42:43.729774: Epoch [ 39/1000] [120/183], total loss: 0.00910, regularization loss: 0.29020, contrastive loss: 0.00910, Loss positive: 0.00000, Loss negative: 0.00910
2018-10-22 11:42:54.077588: Epoch [ 39/1000] [140/183], total loss: 0.00150, regularization loss: 0.29020, contrastive loss: 0.00150, Loss positive: 0.00000, Loss negative: 0.00150
2018-10-22 11:43:04.300654: Epoch [ 39/1000] [160/183], total loss: 0.00284, regularization loss: 0.29020, contrastive loss: 0.00284, Loss positive: 0.00000, Loss negative: 0.00284
2018-10-22 11:43:14.568377: Epoch [ 39/1000] [180/183], total loss: 0.01890, regularization loss: 0.29020, contrastive loss: 0.01890, Loss positive: 0.01810, Loss negative: 0.00080
2018-10-22 11:43:36.154064: Epoch [ 40/1000] [ 20/183], total loss: 0.00005, regularization loss: 0.29021, contrastive loss: 0.00005, Loss positive: 0.00000, Loss negative: 0.00005
2018-10-22 11:43:46.269004: Epoch [ 40/1000] [ 40/183], total loss: 0.00270, regularization loss: 0.29021, contrastive loss: 0.00270, Loss positive: 0.00000, Loss negative: 0.00270
2018-10-22 11:43:56.425187: Epoch [ 40/1000] [ 60/183], total loss: 0.00173, regularization loss: 0.29021, contrastive loss: 0.00173, Loss positive: 0.00000, Loss negative: 0.00173
2018-10-22 11:44:06.567853: Epoch [ 40/1000] [ 80/183], total loss: 0.00025, regularization loss: 0.29021, contrastive loss: 0.00025, Loss positive: 0.00000, Loss negative: 0.00025
2018-10-22 11:44:16.831539: Epoch [ 40/1000] [100/183], total loss: 0.00000, regularization loss: 0.29021, contrastive loss: 0.00000, Loss positive: 0.00000, Loss negative: 0.00000
2018-10-22 11:44:27.052450: Epoch [ 40/1000] [120/183], total loss: 0.00066, regularization loss: 0.29021, contrastive loss: 0.00066, Loss positive: 0.00000, Loss negative: 0.00066
2018-10-22 11:44:37.422173: Epoch [ 40/1000] [140/183], total loss: 0.00060, regularization loss: 0.29021, contrastive loss: 0.00060, Loss positive: 0.00000, Loss negative: 0.00060
2018-10-22 11:44:47.630745: Epoch [ 40/1000] [160/183], total loss: 0.00063, regularization loss: 0.29020, contrastive loss: 0.00063, Loss positive: 0.00000, Loss negative: 0.00063
2018-10-22 11:44:57.912207: Epoch [ 40/1000] [180/183], total loss: 0.00067, regularization loss: 0.29020, contrastive loss: 0.00067, Loss positive: 0.00000, Loss negative: 0.00067
Recall@1: 0.17353
Recall@2: 0.26958
Recall@4: 0.39619
Recall@8: 0.53815
Recall@16: 0.68332
Recall@32: 0.81212
Inside function n_clusters = 100
sampler_num_per_class = [50. 60. 60. 60. 49. 60. 59. 60. 60. 60. 60. 60. 50. 60. 59. 60. 59. 60.
59. 60. 60. 60. 60. 59. 59. 59. 60. 60. 60. 60. 60. 60. 60. 60. 59. 60.
60. 60. 60. 60. 58. 60. 60. 60. 60. 60. 60. 60. 59. 60. 51. 60. 59. 60.
60. 60. 59. 60. 60. 59. 60. 60. 60. 60. 60. 59. 60. 59. 59. 60. 60. 60.
60. 60. 60. 60. 60. 56. 59. 60. 59. 60. 60. 60. 60. 60. 50. 60. 60. 58.
60. 60. 60. 60. 60. 59. 60. 60. 60. 60.]
sampler_num_per_cluster = [ 78. 74. 48. 51. 83. 54. 69. 70. 76. 77. 70. 55. 74. 67.
27. 36. 91. 45. 50. 30. 36. 36. 41. 47. 43. 48. 59. 68.
20. 89. 67. 60. 97. 38. 47. 39. 53. 40. 45. 51. 66. 75.
73. 50. 56. 70. 47. 51. 58. 99. 100. 43. 40. 46. 49. 74.
50. 77. 67. 63. 71. 48. 66. 35. 46. 61. 45. 76. 70. 68.
77. 73. 77. 61. 64. 67. 85. 89. 40. 62. 21. 60. 59. 40.
43. 63. 55. 44. 68. 40. 81. 65. 68. 74. 91. 39. 61. 80.
56. 32.]
Purity is 0.189
count_cross = [[0. 0. 0. ... 0. 0. 2.]
[0. 0. 0. ... 1. 0. 0.]
[1. 0. 0. ... 0. 0. 0.]
...
[0. 0. 0. ... 0. 0. 3.]
[0. 0. 0. ... 0. 0. 5.]
[0. 0. 3. ... 1. 0. 0.]]
Mutual information is 1.81630
5924.0
5924
Entropy cluster is 4.56052
Entropy class is 4.60444
normalized_mutual_information is 0.39636
tp_and_fp = 187758.0
tp = 14381.0
fp is 173377.0
fn is 158369.0
RI is 0.9810905495155418
Precision is 0.0765932743212007
Recall is 0.08324746743849494
F_1 is 0.07978186337057708
normalized_mutual_information = 0.39635791404635673
RI = 0.9810905495155418
F_1 = 0.07978186337057708
The NN is 0.17353
The FT is 0.09831
The ST is 0.16014
The DCG is 0.48022
The E is 0.08087
The MAP is 0.07657
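NN, FT, ST, DCG, E and MAP are the usual shape-retrieval measures; NN is the precision of the single nearest neighbour and so coincides with Recall@1 above (0.17353). FT (first tier) is the recall within the top |C|-1 retrieved items for a query whose class has |C| members, and ST (second tier) the recall within the top 2(|C|-1). A minimal sketch of those three, assuming a per-query ranking with the query itself excluded (class sizes in this run are at least 49, so |C| > 1 always holds):

import numpy as np

def nn_ft_st(order, labels):
    # order: (n, n-1) neighbour indices per query, closest first
    n = len(labels)
    nn = ft = st = 0.0
    for q in range(n):
        rel = labels[order[q]] == labels[q]       # relevance per rank
        c = int(rel.sum()) + 1                    # class size |C|, incl. the query
        nn += float(rel[0])                       # nearest neighbour correct?
        ft += rel[:c - 1].mean()                  # recall in top |C|-1
        st += rel[:2 * (c - 1)].sum() / (c - 1)   # recall in top 2(|C|-1)
    return nn / n, ft / n, st / n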
2018-10-22 11:46:23.233162: Epoch [ 41/1000] [ 20/183], total loss: 0.00266, regularization loss: 0.29020, contrastive loss: 0.00266, Loss positive: 0.00000, Loss negative: 0.00266
2018-10-22 11:46:33.310291: Epoch [ 41/1000] [ 40/183], total loss: 0.00460, regularization loss: 0.29020, contrastive loss: 0.00460, Loss positive: 0.00000, Loss negative: 0.00460
2018-10-22 11:46:43.370074: Epoch [ 41/1000] [ 60/183], total loss: 0.07920, regularization loss: 0.29020, contrastive loss: 0.07920, Loss positive: 0.07919, Loss negative: 0.00001
2018-10-22 11:46:53.472674: Epoch [ 41/1000] [ 80/183], total loss: 0.05574, regularization loss: 0.29020, contrastive loss: 0.05574, Loss positive: 0.05563, Loss negative: 0.00011
2018-10-22 11:47:03.552865: Epoch [ 41/1000] [100/183], total loss: 0.02902, regularization loss: 0.29020, contrastive loss: 0.02902, Loss positive: 0.02595, Loss negative: 0.00307
2018-10-22 11:47:13.689306: Epoch [ 41/1000] [120/183], total loss: 0.00080, regularization loss: 0.29020, contrastive loss: 0.00080, Loss positive: 0.00000, Loss negative: 0.00080
2018-10-22 11:47:23.834249: Epoch [ 41/1000] [140/183], total loss: 0.00080, regularization loss: 0.29020, contrastive loss: 0.00080, Loss positive: 0.00000, Loss negative: 0.00080
2018-10-22 11:47:34.046530: Epoch [ 41/1000] [160/183], total loss: 0.03833, regularization loss: 0.29020, contrastive loss: 0.03833, Loss positive: 0.03603, Loss negative: 0.00230
2018-10-22 11:47:44.272401: Epoch [ 41/1000] [180/183], total loss: 0.00097, regularization loss: 0.29020, contrastive loss: 0.00097, Loss positive: 0.00000, Loss negative: 0.00097