File name: Deep Learning With Python-Jason Brownlee(2017)_Code
File size: 4.09MB
File format: RAR
Updated: 2021-02-26 14:20:16
Tags: python; Deep learning
A major work by Jason Brownlee that introduces how to use Keras, which is where the field is heading; the archive contains the book's example code, organized by chapter.
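To give a flavour of what the chapter scripts look like, below is a minimal sketch of the kind of Keras Sequential model the early chapters build (e.g. chapter_07/first_mlp.py against pima-indians-diabetes.csv). It is not the archive's code; the file name, column layout and hyperparameters are assumptions, and very old Keras releases spell the epochs argument as nb_epoch.

# Minimal Keras MLP sketch in the spirit of chapter_07/first_mlp.py (not the archive's exact code).
# Assumes pima-indians-diabetes.csv (8 numeric inputs, 0/1 label in the last column) is in the
# working directory and the standalone keras package (Keras 2 API) is installed.
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# Load the CSV: columns 0-7 are features, column 8 is the binary outcome.
data = np.loadtxt('pima-indians-diabetes.csv', delimiter=',')
X, y = data[:, 0:8], data[:, 8]

# Small fully connected network with a sigmoid output for binary classification.
model = Sequential()
model.add(Dense(12, input_dim=8, activation='relu'))
model.add(Dense(8, activation='relu'))
model.add(Dense(1, activation='sigmoid'))

model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
model.fit(X, y, epochs=150, batch_size=10, verbose=0)

# Evaluate on the same data it was trained on (for illustration only).
_, accuracy = model.evaluate(X, y, verbose=0)
print('Accuracy: %.2f%%' % (accuracy * 100))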
[File preview]:
code
----chapter_22()
--------imdb_plot.py(819B)
--------imdb_mlp.py(1KB)
--------imdb_cnn.py(1KB)
----chapter_10()
--------iris.csv(4KB)
--------iris_example.py(1KB)
----chapter_25()
--------lstm_stacked.py(3KB)
--------lstm_stateful.py(3KB)
--------international-airline-passengers.csv(2KB)
--------lstm_window.py(3KB)
--------lstm_simple.py(3KB)
--------lstm_time_steps.py(3KB)
----chapter_20()
--------augment_feature_standardize.py(1KB)
--------augment_baseline.py(344B)
--------augment_zca.py(967B)
--------augment_shifts.py(1009B)
--------augment_save_to_file.py(1KB)
--------augment_rotations.py(968B)
--------augment_flips.py(987B)
----chapter_19()
--------mnist_cnn.py(2KB)
--------mnist_plot.py(525B)
--------mnist_cnn_large.py(2KB)
--------mnist_mlp_baseline.py(1KB)
----chapter_27()
--------lstm_var_length.py(2KB)
--------lstm_char_seq_features.py(2KB)
--------lstm_char_seq_timesteps.py(2KB)
--------lstm_one_char_stateful.py(2KB)
--------lstm_char_seq_batch.py(2KB)
--------lstm_one_char.py(2KB)
----chapter_17()
--------decay_time_based.py(1KB)
--------decay_drop_based.py(1KB)
--------ionosphere.csv(75KB)
----chapter_24()
--------mlp_window.py(2KB)
--------international-airline-passengers.csv(2KB)
--------mlp_simple.py(2KB)
----chapter_16()
--------dropout_visible.py(2KB)
--------sonar.csv(86KB)
--------dropout_hidden.py(2KB)
--------baseline.py(2KB)
----chapter_21()
--------cifar10_plot.py(345B)
--------cifar10_cnn.py(2KB)
--------cifar10_cnn_large.py(2KB)
----chapter_13()
--------serialize_yaml.py(2KB)
--------serialize_json.py(2KB)
--------pima-indians-diabetes.csv(23KB)
----chapter_11()
--------sonar.csv(86KB)
--------sonar_baseline.py(1KB)
--------sonar_standardized.py(2KB)
--------sonar_standardized_smaller.py(2KB)
--------sonar_standardized_larger.py(2KB)
----chapter_07()
--------pima-indians-diabetes.csv(23KB)
--------first_mlp.py(793B)
----chapter_26()
--------lstm_dropout_gates.py(1KB)
--------lstm_dropout_layers.py(1KB)
--------lstm_cnn.py(1KB)
--------lstm_simple.py(1KB)
----chapter_28()
--------weights-improvement-19-1.9435.hdf5(1.06MB)
--------weights-improvement-47-1.2219-bigger.hdf5(3.07MB)
--------lstm_larger_gen_text.py(2KB)
--------lstm_small.py(2KB)
--------wonderland.txt(144KB)
--------lstm_gen_text.py(2KB)
--------lstm_larger.py(2KB)
----chapter_09()
--------sklearn_grid_search_params.py(2KB)
--------pima-indians-diabetes.csv(23KB)
--------sklearn_cross_validation.py(1KB)
----chapter_03()
--------tensorflow_example.py(384B)
----chapter_14()
--------checkpoint_best_model.py(1KB)
--------checkpoint_load.py(1KB)
--------pima-indians-diabetes.csv(23KB)
--------checkpoint_model_improvements.py(1KB)
----chapter_08()
--------manual_split.py(925B)
--------manual_cross_validation.py(1KB)
--------pima-indians-diabetes.csv(23KB)
--------automatic_split.py(704B)
----chapter_02()
--------theano_example.py(396B)
----chapter_15()
--------plot_history.py(1KB)
--------pima-indians-diabetes.csv(23KB)
----chapter_12()
--------boston_standardized.py(1KB)
--------housing.csv(48KB)
--------boston_baseline.py(1KB)
--------boston_standardized_larger.py(1KB)
--------boston_standardized_wider.py(1KB)
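For the time-series chapters listed above (chapter_24 and chapter_25 work with international-airline-passengers.csv), a minimal sketch of the kind of LSTM regression model those scripts build might look like the following. Again, this is not the archive's code; the CSV layout, window size and hyperparameters are assumptions.

# Minimal LSTM time-series sketch in the spirit of chapter_25/lstm_simple.py (not the archive's exact code).
# Assumes international-airline-passengers.csv holds the monthly passenger counts in its second
# column; if your copy has trailing footer rows (the original dataset does), pass skipfooter=3.
import numpy as np
import pandas as pd
from sklearn.preprocessing import MinMaxScaler
from keras.models import Sequential
from keras.layers import Dense, LSTM

# Load the monthly counts and scale them to [0, 1] for the LSTM.
values = pd.read_csv('international-airline-passengers.csv',
                     usecols=[1], engine='python').values.astype('float32')
values = MinMaxScaler(feature_range=(0, 1)).fit_transform(values)

# Frame the series as supervised learning: predict month t+1 from month t.
def to_supervised(series, look_back=1):
    X, y = [], []
    for i in range(len(series) - look_back):
        X.append(series[i:i + look_back, 0])
        y.append(series[i + look_back, 0])
    return np.array(X), np.array(y)

X, y = to_supervised(values)
X = X.reshape((X.shape[0], X.shape[1], 1))  # [samples, time steps, features]

# One small LSTM layer feeding a linear output neuron.
model = Sequential()
model.add(LSTM(4, input_shape=(1, 1)))
model.add(Dense(1))
model.compile(loss='mean_squared_error', optimizer='adam')
model.fit(X, y, epochs=100, batch_size=1, verbose=0)
print('Train MSE: %.4f' % model.evaluate(X, y, verbose=0))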