This can be hard to figure out since I can't share the data. I am running it in Google Colab.
```python
automl_grid_search(csv_path='/content/CLT_all_tasks_trial_level.csv',
                   target_field='correctResp',
                   model_name='tpu',
                   tpu_address=tpu_address)
```
```
Solving a binary_classification problem, maximizing accuracy using tensorflow.

Modeling with field specifications:
Subject: categorical
Finished: categorical
TrainingDay: categorical
Condition: categorical
CondPrev: categorical
TaskNumber: categorical
TaskId: categorical
TrialNumber: numeric
PresentationStimulus: numeric
StimTime: numeric
RespToTime: numeric
RT: numeric
SubjResp: categorical
OutcomeInt: categorical
TaskOutcomeInt: categorical
StimDim1: categorical
StimDim2: categorical
StimDim3: categorical
StimDim4: categorical
IntendedRule: categorical
Background: categorical
StimDimWord1: categorical
StimDimWord2: categorical
StimDimWord3: categorical
StimDimWord4: categorical
ExpResp: categorical
DistinctDays: categorical
out: categorical
StimType: categorical

  0%  0/100 [00:00<?, ?trial/s]
  0%  0/20 [00:00<?, ?epoch/s]
```
```
---------------------------------------------------------------------------
IndexError                                Traceback (most recent call last)
<ipython-input-16-ca69e1157d4e> in <module>()
      2     target_field='correctResp',
      3     model_name='tpu',
----> 4     tpu_address=tpu_address)

/usr/local/lib/python3.6/dist-packages/automl_gs/automl_gs.py in automl_grid_search(csv_path, target_field, target_metric, framework, model_name, context, num_trials, split, num_epochs, col_types, gpu, tpu_address)
     92                         header=(best_result is None))
     93
---> 94     train_results = results.tail(1).to_dict('records')[0]
     95
     96     # If the target metric improves, save the new hps/files,

IndexError: list index out of range
```
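For what it's worth, the `IndexError` itself looks like a secondary symptom: if the trial's training subprocess crashes before writing any rows, the results DataFrame is empty, and `tail(1).to_dict('records')` gives an empty list. A minimal sketch of that failure mode (the column names here are just illustrative, not automl_gs's actual schema):

```python
import pandas as pd

# Stand-in for the per-trial results file when training crashed
# before logging anything: a DataFrame with columns but no rows.
results = pd.DataFrame(columns=["trial_id", "accuracy"])

# tail(1) on an empty frame is still empty, so 'records' is [].
records = results.tail(1).to_dict("records")
print(records)  # []

# Indexing [0] into the empty list is exactly the reported error.
try:
    records[0]
except IndexError as exc:
    print("IndexError:", exc)
```

So the real problem is whatever made the TPU training script die first, which the log below points at.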
Here is the log (note the lines appear newest-first, so the traceback reads bottom-up):
```
Apr 3, 2019, 5:31:44 PM | WARNING | ValueError: Input contains NaN, infinity or a value too large for dtype('float32').
Apr 3, 2019, 5:31:44 PM | WARNING |     raise ValueError(msg_err.format(type_err, X.dtype))
Apr 3, 2019, 5:31:44 PM | WARNING |   File "/usr/local/lib/python3.6/dist-packages/sklearn/utils/validation.py", line 56, in _assert_all_finite
Apr 3, 2019, 5:31:44 PM | WARNING |     allow_nan=force_all_finite == 'allow-nan')
Apr 3, 2019, 5:31:44 PM | WARNING |   File "/usr/local/lib/python3.6/dist-packages/sklearn/utils/validation.py", line 573, in check_array
Apr 3, 2019, 5:31:44 PM | WARNING |     y_pred = check_array(y_pred, ensure_2d=False)
Apr 3, 2019, 5:31:44 PM | WARNING |   File "/usr/local/lib/python3.6/dist-packages/sklearn/metrics/classification.py", line 1763, in log_loss
Apr 3, 2019, 5:31:44 PM | WARNING |     logloss = log_loss(y_true, y_pred)
Apr 3, 2019, 5:31:44 PM | WARNING |   File "/content/tpu_train/pipeline.py", line 1126, in on_epoch_end
Apr 3, 2019, 5:31:44 PM | WARNING |     callback.on_epoch_end(epoch, logs)
Apr 3, 2019, 5:31:44 PM | WARNING |   File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/callbacks.py", line 251, in on_epoch_end
Apr 3, 2019, 5:31:44 PM | WARNING |     callbacks.on_epoch_end(epoch, epoch_logs)
Apr 3, 2019, 5:31:44 PM | WARNING |   File "/usr/local/lib/python3.6/dist-packages/tensorflow/contrib/tpu/python/tpu/keras_support.py", line 1734, in _pipeline_fit_loop
Apr 3, 2019, 5:31:44 PM | WARNING |     validation_steps=validation_steps)
Apr 3, 2019, 5:31:44 PM | WARNING |   File "/usr/local/lib/python3.6/dist-packages/tensorflow/contrib/tpu/python/tpu/keras_support.py", line 1633, in _pipeline_fit
Apr 3, 2019, 5:31:44 PM | WARNING |     steps_per_epoch, validation_steps, **kwargs)
Apr 3, 2019, 5:31:44 PM | WARNING |   File "/usr/local/lib/python3.6/dist-packages/tensorflow/contrib/tpu/python/tpu/keras_support.py", line 1532, in fit
Apr 3, 2019, 5:31:44 PM | WARNING |     batch_size=64 * 8)
Apr 3, 2019, 5:31:44 PM | WARNING |   File "/content/tpu_train/pipeline.py", line 1095, in model_train
Apr 3, 2019, 5:31:44 PM | WARNING |     model_train(df, encoders, args, model)
Apr 3, 2019, 5:31:44 PM | WARNING |   File "model.py", line 69, in <module>
Apr 3, 2019, 5:31:44 PM | WARNING | Traceback (most recent call last):
```
AFAIK, the largest number in the dataset is 12007245.
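12007245 is well within float32 range, so the "too large" branch of that error seems unlikely; NaN or inf values (e.g. missing reaction times) seem the more plausible trigger. In case it helps, here is a quick diagnostic sketch for finding such columns. Since I can't share the real CSV, a tiny stand-in DataFrame with the same kinds of problems is used in place of `pd.read_csv(...)`, and the column names are just illustrative:

```python
import numpy as np
import pandas as pd

# Stand-in for pd.read_csv('/content/CLT_all_tasks_trial_level.csv'):
# a small frame seeded with one NaN and one infinity.
df = pd.DataFrame({
    "RT": [0.41, np.nan, 0.38],      # a missing reaction time
    "StimTime": [1.0, 2.0, np.inf],  # a stray infinity
    "TrialNumber": [1, 2, 3],
})

# Columns containing any NaN value.
nan_cols = df.columns[df.isna().any()].tolist()

# Columns containing +/-inf (checked over numeric columns only).
numeric = df.select_dtypes(include=[np.number])
inf_cols = numeric.columns[np.isinf(numeric).any()].tolist()

print("columns with NaN:", nan_cols)  # ['RT']
print("columns with inf:", inf_cols)  # ['StimTime']
```

Running something like this over the real file would confirm whether sklearn's complaint is about missing values rather than magnitude.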
Thanks for the help!